Feb 19 18:28:52 localhost kernel: Linux version 5.14.0-681.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Wed Feb 11 20:19:22 UTC 2026
Feb 19 18:28:52 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Feb 19 18:28:52 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-681.el9.x86_64 root=UUID=9d578f93-c4e9-4172-8459-ef150e54751c ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 19 18:28:52 localhost kernel: BIOS-provided physical RAM map:
Feb 19 18:28:52 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Feb 19 18:28:52 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Feb 19 18:28:52 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Feb 19 18:28:52 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Feb 19 18:28:52 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Feb 19 18:28:52 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Feb 19 18:28:52 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Feb 19 18:28:52 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Feb 19 18:28:52 localhost kernel: NX (Execute Disable) protection: active
Feb 19 18:28:52 localhost kernel: APIC: Static calls initialized
Feb 19 18:28:52 localhost kernel: SMBIOS 2.8 present.
Feb 19 18:28:52 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Feb 19 18:28:52 localhost kernel: Hypervisor detected: KVM
Feb 19 18:28:52 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Feb 19 18:28:52 localhost kernel: kvm-clock: using sched offset of 8566819525 cycles
Feb 19 18:28:52 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Feb 19 18:28:52 localhost kernel: tsc: Detected 2799.998 MHz processor
Feb 19 18:28:52 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 19 18:28:52 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 19 18:28:52 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Feb 19 18:28:52 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Feb 19 18:28:52 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Feb 19 18:28:52 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Feb 19 18:28:52 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Feb 19 18:28:52 localhost kernel: Using GB pages for direct mapping
Feb 19 18:28:52 localhost kernel: RAMDISK: [mem 0x1b6f6000-0x29b72fff]
Feb 19 18:28:52 localhost kernel: ACPI: Early table checksum verification disabled
Feb 19 18:28:52 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Feb 19 18:28:52 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 19 18:28:52 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 19 18:28:52 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 19 18:28:52 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Feb 19 18:28:52 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 19 18:28:52 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 19 18:28:52 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Feb 19 18:28:52 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Feb 19 18:28:52 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Feb 19 18:28:52 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Feb 19 18:28:52 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Feb 19 18:28:52 localhost kernel: No NUMA configuration found
Feb 19 18:28:52 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Feb 19 18:28:52 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Feb 19 18:28:52 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Feb 19 18:28:52 localhost kernel: Zone ranges:
Feb 19 18:28:52 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Feb 19 18:28:52 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Feb 19 18:28:52 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Feb 19 18:28:52 localhost kernel:   Device   empty
Feb 19 18:28:52 localhost kernel: Movable zone start for each node
Feb 19 18:28:52 localhost kernel: Early memory node ranges
Feb 19 18:28:52 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Feb 19 18:28:52 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Feb 19 18:28:52 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Feb 19 18:28:52 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Feb 19 18:28:52 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 19 18:28:52 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Feb 19 18:28:52 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Feb 19 18:28:52 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Feb 19 18:28:52 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Feb 19 18:28:52 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Feb 19 18:28:52 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 19 18:28:52 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Feb 19 18:28:52 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 19 18:28:52 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Feb 19 18:28:52 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Feb 19 18:28:52 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 19 18:28:52 localhost kernel: TSC deadline timer available
Feb 19 18:28:52 localhost kernel: CPU topo: Max. logical packages:   8
Feb 19 18:28:52 localhost kernel: CPU topo: Max. logical dies:       8
Feb 19 18:28:52 localhost kernel: CPU topo: Max. dies per package:   1
Feb 19 18:28:52 localhost kernel: CPU topo: Max. threads per core:   1
Feb 19 18:28:52 localhost kernel: CPU topo: Num. cores per package:     1
Feb 19 18:28:52 localhost kernel: CPU topo: Num. threads per package:   1
Feb 19 18:28:52 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Feb 19 18:28:52 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Feb 19 18:28:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Feb 19 18:28:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Feb 19 18:28:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Feb 19 18:28:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Feb 19 18:28:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Feb 19 18:28:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Feb 19 18:28:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Feb 19 18:28:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Feb 19 18:28:52 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Feb 19 18:28:52 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Feb 19 18:28:52 localhost kernel: Booting paravirtualized kernel on KVM
Feb 19 18:28:52 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 19 18:28:52 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Feb 19 18:28:52 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Feb 19 18:28:52 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Feb 19 18:28:52 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Feb 19 18:28:52 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Feb 19 18:28:52 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-681.el9.x86_64 root=UUID=9d578f93-c4e9-4172-8459-ef150e54751c ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 19 18:28:52 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-681.el9.x86_64", will be passed to user space.
Feb 19 18:28:52 localhost kernel: random: crng init done
Feb 19 18:28:52 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Feb 19 18:28:52 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Feb 19 18:28:52 localhost kernel: Fallback order for Node 0: 0 
Feb 19 18:28:52 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Feb 19 18:28:52 localhost kernel: Policy zone: Normal
Feb 19 18:28:52 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 19 18:28:52 localhost kernel: software IO TLB: area num 8.
Feb 19 18:28:52 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Feb 19 18:28:52 localhost kernel: ftrace: allocating 49565 entries in 194 pages
Feb 19 18:28:52 localhost kernel: ftrace: allocated 194 pages with 3 groups
Feb 19 18:28:52 localhost kernel: Dynamic Preempt: voluntary
Feb 19 18:28:52 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 19 18:28:52 localhost kernel: rcu:         RCU event tracing is enabled.
Feb 19 18:28:52 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Feb 19 18:28:52 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Feb 19 18:28:52 localhost kernel:         Rude variant of Tasks RCU enabled.
Feb 19 18:28:52 localhost kernel:         Tracing variant of Tasks RCU enabled.
Feb 19 18:28:52 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 19 18:28:52 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Feb 19 18:28:52 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 19 18:28:52 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 19 18:28:52 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 19 18:28:52 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Feb 19 18:28:52 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Feb 19 18:28:52 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Feb 19 18:28:52 localhost kernel: Console: colour VGA+ 80x25
Feb 19 18:28:52 localhost kernel: printk: console [ttyS0] enabled
Feb 19 18:28:52 localhost kernel: ACPI: Core revision 20230331
Feb 19 18:28:52 localhost kernel: APIC: Switch to symmetric I/O mode setup
Feb 19 18:28:52 localhost kernel: x2apic enabled
Feb 19 18:28:52 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Feb 19 18:28:52 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Feb 19 18:28:52 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Feb 19 18:28:52 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Feb 19 18:28:52 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Feb 19 18:28:52 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Feb 19 18:28:52 localhost kernel: mitigations: Enabled attack vectors: user_kernel, user_user, guest_host, guest_guest, SMT mitigations: auto
Feb 19 18:28:52 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 19 18:28:52 localhost kernel: Spectre V2 : Mitigation: Retpolines
Feb 19 18:28:52 localhost kernel: RETBleed: Mitigation: untrained return thunk
Feb 19 18:28:52 localhost kernel: Speculative Return Stack Overflow: Mitigation: SMT disabled
Feb 19 18:28:52 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 19 18:28:52 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Feb 19 18:28:52 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Feb 19 18:28:52 localhost kernel: active return thunk: retbleed_return_thunk
Feb 19 18:28:52 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 19 18:28:52 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 19 18:28:52 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 19 18:28:52 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 19 18:28:52 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Feb 19 18:28:52 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Feb 19 18:28:52 localhost kernel: Freeing SMP alternatives memory: 40K
Feb 19 18:28:52 localhost kernel: pid_max: default: 32768 minimum: 301
Feb 19 18:28:52 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Feb 19 18:28:52 localhost kernel: landlock: Up and running.
Feb 19 18:28:52 localhost kernel: Yama: becoming mindful.
Feb 19 18:28:52 localhost kernel: SELinux:  Initializing.
Feb 19 18:28:52 localhost kernel: LSM support for eBPF active
Feb 19 18:28:52 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 19 18:28:52 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 19 18:28:52 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Feb 19 18:28:52 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Feb 19 18:28:52 localhost kernel: ... version:                0
Feb 19 18:28:52 localhost kernel: ... bit width:              48
Feb 19 18:28:52 localhost kernel: ... generic registers:      6
Feb 19 18:28:52 localhost kernel: ... value mask:             0000ffffffffffff
Feb 19 18:28:52 localhost kernel: ... max period:             00007fffffffffff
Feb 19 18:28:52 localhost kernel: ... fixed-purpose events:   0
Feb 19 18:28:52 localhost kernel: ... event mask:             000000000000003f
Feb 19 18:28:52 localhost kernel: signal: max sigframe size: 1776
Feb 19 18:28:52 localhost kernel: rcu: Hierarchical SRCU implementation.
Feb 19 18:28:52 localhost kernel: rcu:         Max phase no-delay instances is 400.
Feb 19 18:28:52 localhost kernel: smp: Bringing up secondary CPUs ...
Feb 19 18:28:52 localhost kernel: smpboot: x86: Booting SMP configuration:
Feb 19 18:28:52 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Feb 19 18:28:52 localhost kernel: smp: Brought up 1 node, 8 CPUs
Feb 19 18:28:52 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Feb 19 18:28:52 localhost kernel: node 0 deferred pages initialised in 10ms
Feb 19 18:28:52 localhost kernel: Memory: 7617592K/8388068K available (16384K kernel code, 5795K rwdata, 13948K rodata, 4204K init, 7180K bss, 764368K reserved, 0K cma-reserved)
Feb 19 18:28:52 localhost kernel: devtmpfs: initialized
Feb 19 18:28:52 localhost kernel: x86/mm: Memory block size: 128MB
Feb 19 18:28:52 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 19 18:28:52 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Feb 19 18:28:52 localhost kernel: pinctrl core: initialized pinctrl subsystem
Feb 19 18:28:52 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 19 18:28:52 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Feb 19 18:28:52 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Feb 19 18:28:52 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Feb 19 18:28:52 localhost kernel: audit: initializing netlink subsys (disabled)
Feb 19 18:28:52 localhost kernel: audit: type=2000 audit(1771525731.273:1): state=initialized audit_enabled=0 res=1
Feb 19 18:28:52 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Feb 19 18:28:52 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 19 18:28:52 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 19 18:28:52 localhost kernel: cpuidle: using governor menu
Feb 19 18:28:52 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 19 18:28:52 localhost kernel: PCI: Using configuration type 1 for base access
Feb 19 18:28:52 localhost kernel: PCI: Using configuration type 1 for extended access
Feb 19 18:28:52 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 19 18:28:52 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Feb 19 18:28:52 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Feb 19 18:28:52 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Feb 19 18:28:52 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Feb 19 18:28:52 localhost kernel: Demotion targets for Node 0: null
Feb 19 18:28:52 localhost kernel: cryptd: max_cpu_qlen set to 1000
Feb 19 18:28:52 localhost kernel: ACPI: Added _OSI(Module Device)
Feb 19 18:28:52 localhost kernel: ACPI: Added _OSI(Processor Device)
Feb 19 18:28:52 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 19 18:28:52 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Feb 19 18:28:52 localhost kernel: ACPI: Interpreter enabled
Feb 19 18:28:52 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Feb 19 18:28:52 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Feb 19 18:28:52 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 19 18:28:52 localhost kernel: PCI: Using E820 reservations for host bridge windows
Feb 19 18:28:52 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Feb 19 18:28:52 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Feb 19 18:28:52 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [3] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [4] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [5] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [6] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [7] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [8] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [9] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [10] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [11] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [12] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [13] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [14] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [15] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [16] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [17] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [18] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [19] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [20] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [21] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [22] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [23] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [24] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [25] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [26] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [27] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [28] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [29] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [30] registered
Feb 19 18:28:52 localhost kernel: acpiphp: Slot [31] registered
Feb 19 18:28:52 localhost kernel: PCI host bridge to bus 0000:00
Feb 19 18:28:52 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Feb 19 18:28:52 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Feb 19 18:28:52 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 19 18:28:52 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Feb 19 18:28:52 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Feb 19 18:28:52 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Feb 19 18:28:52 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Feb 19 18:28:52 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Feb 19 18:28:52 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Feb 19 18:28:52 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Feb 19 18:28:52 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Feb 19 18:28:52 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Feb 19 18:28:52 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Feb 19 18:28:52 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Feb 19 18:28:52 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Feb 19 18:28:52 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Feb 19 18:28:52 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Feb 19 18:28:52 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Feb 19 18:28:52 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Feb 19 18:28:52 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Feb 19 18:28:52 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Feb 19 18:28:52 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Feb 19 18:28:52 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Feb 19 18:28:52 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Feb 19 18:28:52 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb 19 18:28:52 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Feb 19 18:28:52 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Feb 19 18:28:52 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Feb 19 18:28:52 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Feb 19 18:28:52 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Feb 19 18:28:52 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Feb 19 18:28:52 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Feb 19 18:28:52 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Feb 19 18:28:52 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Feb 19 18:28:52 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Feb 19 18:28:52 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Feb 19 18:28:52 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Feb 19 18:28:52 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Feb 19 18:28:52 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Feb 19 18:28:52 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Feb 19 18:28:52 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Feb 19 18:28:52 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Feb 19 18:28:52 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Feb 19 18:28:52 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Feb 19 18:28:52 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Feb 19 18:28:52 localhost kernel: iommu: Default domain type: Translated
Feb 19 18:28:52 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Feb 19 18:28:52 localhost kernel: SCSI subsystem initialized
Feb 19 18:28:52 localhost kernel: ACPI: bus type USB registered
Feb 19 18:28:52 localhost kernel: usbcore: registered new interface driver usbfs
Feb 19 18:28:52 localhost kernel: usbcore: registered new interface driver hub
Feb 19 18:28:52 localhost kernel: usbcore: registered new device driver usb
Feb 19 18:28:52 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Feb 19 18:28:52 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Feb 19 18:28:52 localhost kernel: PTP clock support registered
Feb 19 18:28:52 localhost kernel: EDAC MC: Ver: 3.0.0
Feb 19 18:28:52 localhost kernel: NetLabel: Initializing
Feb 19 18:28:52 localhost kernel: NetLabel:  domain hash size = 128
Feb 19 18:28:52 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Feb 19 18:28:52 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Feb 19 18:28:52 localhost kernel: PCI: Using ACPI for IRQ routing
Feb 19 18:28:52 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Feb 19 18:28:52 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Feb 19 18:28:52 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Feb 19 18:28:52 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Feb 19 18:28:52 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Feb 19 18:28:52 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 19 18:28:52 localhost kernel: vgaarb: loaded
Feb 19 18:28:52 localhost kernel: clocksource: Switched to clocksource kvm-clock
Feb 19 18:28:52 localhost kernel: VFS: Disk quotas dquot_6.6.0
Feb 19 18:28:52 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 19 18:28:52 localhost kernel: pnp: PnP ACPI init
Feb 19 18:28:52 localhost kernel: pnp 00:03: [dma 2]
Feb 19 18:28:52 localhost kernel: pnp: PnP ACPI: found 5 devices
Feb 19 18:28:52 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 19 18:28:52 localhost kernel: NET: Registered PF_INET protocol family
Feb 19 18:28:52 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 19 18:28:52 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Feb 19 18:28:52 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 19 18:28:52 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Feb 19 18:28:52 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Feb 19 18:28:52 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Feb 19 18:28:52 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Feb 19 18:28:52 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Feb 19 18:28:52 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Feb 19 18:28:52 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 19 18:28:52 localhost kernel: NET: Registered PF_XDP protocol family
Feb 19 18:28:52 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Feb 19 18:28:52 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Feb 19 18:28:52 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Feb 19 18:28:52 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Feb 19 18:28:52 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Feb 19 18:28:52 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Feb 19 18:28:52 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Feb 19 18:28:52 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Feb 19 18:28:52 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 58780 usecs
Feb 19 18:28:52 localhost kernel: PCI: CLS 0 bytes, default 64
Feb 19 18:28:52 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Feb 19 18:28:52 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Feb 19 18:28:52 localhost kernel: ACPI: bus type thunderbolt registered
Feb 19 18:28:52 localhost kernel: Trying to unpack rootfs image as initramfs...
Feb 19 18:28:52 localhost kernel: Initialise system trusted keyrings
Feb 19 18:28:52 localhost kernel: Key type blacklist registered
Feb 19 18:28:52 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Feb 19 18:28:52 localhost kernel: zbud: loaded
Feb 19 18:28:52 localhost kernel: integrity: Platform Keyring initialized
Feb 19 18:28:52 localhost kernel: integrity: Machine keyring initialized
Feb 19 18:28:52 localhost kernel: Freeing initrd memory: 233972K
Feb 19 18:28:52 localhost kernel: NET: Registered PF_ALG protocol family
Feb 19 18:28:52 localhost kernel: xor: automatically using best checksumming function   avx       
Feb 19 18:28:52 localhost kernel: Key type asymmetric registered
Feb 19 18:28:52 localhost kernel: Asymmetric key parser 'x509' registered
Feb 19 18:28:52 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Feb 19 18:28:52 localhost kernel: io scheduler mq-deadline registered
Feb 19 18:28:52 localhost kernel: io scheduler kyber registered
Feb 19 18:28:52 localhost kernel: io scheduler bfq registered
Feb 19 18:28:52 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Feb 19 18:28:52 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Feb 19 18:28:52 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Feb 19 18:28:52 localhost kernel: ACPI: button: Power Button [PWRF]
Feb 19 18:28:52 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Feb 19 18:28:52 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Feb 19 18:28:52 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Feb 19 18:28:52 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Feb 19 18:28:52 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Feb 19 18:28:52 localhost kernel: Non-volatile memory driver v1.3
Feb 19 18:28:52 localhost kernel: rdac: device handler registered
Feb 19 18:28:52 localhost kernel: hp_sw: device handler registered
Feb 19 18:28:52 localhost kernel: emc: device handler registered
Feb 19 18:28:52 localhost kernel: alua: device handler registered
Feb 19 18:28:52 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Feb 19 18:28:52 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Feb 19 18:28:52 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Feb 19 18:28:52 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Feb 19 18:28:52 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Feb 19 18:28:52 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Feb 19 18:28:52 localhost kernel: usb usb1: Product: UHCI Host Controller
Feb 19 18:28:52 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-681.el9.x86_64 uhci_hcd
Feb 19 18:28:52 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Feb 19 18:28:52 localhost kernel: hub 1-0:1.0: USB hub found
Feb 19 18:28:52 localhost kernel: hub 1-0:1.0: 2 ports detected
Feb 19 18:28:52 localhost kernel: usbcore: registered new interface driver usbserial_generic
Feb 19 18:28:52 localhost kernel: usbserial: USB Serial support registered for generic
Feb 19 18:28:52 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Feb 19 18:28:52 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Feb 19 18:28:52 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Feb 19 18:28:52 localhost kernel: mousedev: PS/2 mouse device common for all mice
Feb 19 18:28:52 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Feb 19 18:28:52 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Feb 19 18:28:52 localhost kernel: rtc_cmos 00:04: registered as rtc0
Feb 19 18:28:52 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-02-19T18:28:51 UTC (1771525731)
Feb 19 18:28:52 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Feb 19 18:28:52 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Feb 19 18:28:52 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Feb 19 18:28:52 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Feb 19 18:28:52 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Feb 19 18:28:52 localhost kernel: usbcore: registered new interface driver usbhid
Feb 19 18:28:52 localhost kernel: usbhid: USB HID core driver
Feb 19 18:28:52 localhost kernel: drop_monitor: Initializing network drop monitor service
Feb 19 18:28:52 localhost kernel: Initializing XFRM netlink socket
Feb 19 18:28:52 localhost kernel: NET: Registered PF_INET6 protocol family
Feb 19 18:28:52 localhost kernel: Segment Routing with IPv6
Feb 19 18:28:52 localhost kernel: NET: Registered PF_PACKET protocol family
Feb 19 18:28:52 localhost kernel: mpls_gso: MPLS GSO support
Feb 19 18:28:52 localhost kernel: IPI shorthand broadcast: enabled
Feb 19 18:28:52 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Feb 19 18:28:52 localhost kernel: AES CTR mode by8 optimization enabled
Feb 19 18:28:52 localhost kernel: sched_clock: Marking stable (1184015165, 150295437)->(1408096366, -73785764)
Feb 19 18:28:52 localhost kernel: registered taskstats version 1
Feb 19 18:28:52 localhost kernel: Loading compiled-in X.509 certificates
Feb 19 18:28:52 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4fde1469d8033882223ec575c1a1c1b88c9a497b'
Feb 19 18:28:52 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Feb 19 18:28:52 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Feb 19 18:28:52 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Feb 19 18:28:52 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Feb 19 18:28:52 localhost kernel: Demotion targets for Node 0: null
Feb 19 18:28:52 localhost kernel: page_owner is disabled
Feb 19 18:28:52 localhost kernel: Key type .fscrypt registered
Feb 19 18:28:52 localhost kernel: Key type fscrypt-provisioning registered
Feb 19 18:28:52 localhost kernel: Key type big_key registered
Feb 19 18:28:52 localhost kernel: Key type encrypted registered
Feb 19 18:28:52 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Feb 19 18:28:52 localhost kernel: Loading compiled-in module X.509 certificates
Feb 19 18:28:52 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4fde1469d8033882223ec575c1a1c1b88c9a497b'
Feb 19 18:28:52 localhost kernel: ima: Allocated hash algorithm: sha256
Feb 19 18:28:52 localhost kernel: ima: No architecture policies found
Feb 19 18:28:52 localhost kernel: evm: Initialising EVM extended attributes:
Feb 19 18:28:52 localhost kernel: evm: security.selinux
Feb 19 18:28:52 localhost kernel: evm: security.SMACK64 (disabled)
Feb 19 18:28:52 localhost kernel: evm: security.SMACK64EXEC (disabled)
Feb 19 18:28:52 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Feb 19 18:28:52 localhost kernel: evm: security.SMACK64MMAP (disabled)
Feb 19 18:28:52 localhost kernel: evm: security.apparmor (disabled)
Feb 19 18:28:52 localhost kernel: evm: security.ima
Feb 19 18:28:52 localhost kernel: evm: security.capability
Feb 19 18:28:52 localhost kernel: evm: HMAC attrs: 0x1
Feb 19 18:28:52 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Feb 19 18:28:52 localhost kernel: Running certificate verification RSA selftest
Feb 19 18:28:52 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Feb 19 18:28:52 localhost kernel: Running certificate verification ECDSA selftest
Feb 19 18:28:52 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Feb 19 18:28:52 localhost kernel: clk: Disabling unused clocks
Feb 19 18:28:52 localhost kernel: Freeing unused decrypted memory: 2028K
Feb 19 18:28:52 localhost kernel: Freeing unused kernel image (initmem) memory: 4204K
Feb 19 18:28:52 localhost kernel: Write protecting the kernel read-only data: 30720k
Feb 19 18:28:52 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 388K
Feb 19 18:28:52 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Feb 19 18:28:52 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Feb 19 18:28:52 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Feb 19 18:28:52 localhost kernel: usb 1-1: Manufacturer: QEMU
Feb 19 18:28:52 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Feb 19 18:28:52 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Feb 19 18:28:52 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Feb 19 18:28:52 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Feb 19 18:28:52 localhost kernel: Run /init as init process
Feb 19 18:28:52 localhost kernel:   with arguments:
Feb 19 18:28:52 localhost kernel:     /init
Feb 19 18:28:52 localhost kernel:   with environment:
Feb 19 18:28:52 localhost kernel:     HOME=/
Feb 19 18:28:52 localhost kernel:     TERM=linux
Feb 19 18:28:52 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-681.el9.x86_64
Feb 19 18:28:52 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 19 18:28:52 localhost systemd[1]: Detected virtualization kvm.
Feb 19 18:28:52 localhost systemd[1]: Detected architecture x86-64.
Feb 19 18:28:52 localhost systemd[1]: Running in initrd.
Feb 19 18:28:52 localhost systemd[1]: No hostname configured, using default hostname.
Feb 19 18:28:52 localhost systemd[1]: Hostname set to <localhost>.
Feb 19 18:28:52 localhost systemd[1]: Initializing machine ID from VM UUID.
Feb 19 18:28:52 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Feb 19 18:28:52 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 19 18:28:52 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 19 18:28:52 localhost systemd[1]: Reached target Initrd /usr File System.
Feb 19 18:28:52 localhost systemd[1]: Reached target Local File Systems.
Feb 19 18:28:52 localhost systemd[1]: Reached target Path Units.
Feb 19 18:28:52 localhost systemd[1]: Reached target Slice Units.
Feb 19 18:28:52 localhost systemd[1]: Reached target Swaps.
Feb 19 18:28:52 localhost systemd[1]: Reached target Timer Units.
Feb 19 18:28:52 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 19 18:28:52 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Feb 19 18:28:52 localhost systemd[1]: Listening on Journal Socket.
Feb 19 18:28:52 localhost systemd[1]: Listening on udev Control Socket.
Feb 19 18:28:52 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 19 18:28:52 localhost systemd[1]: Reached target Socket Units.
Feb 19 18:28:52 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 19 18:28:52 localhost systemd[1]: Starting Journal Service...
Feb 19 18:28:52 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Feb 19 18:28:52 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 19 18:28:52 localhost systemd[1]: Starting Create System Users...
Feb 19 18:28:52 localhost systemd[1]: Starting Setup Virtual Console...
Feb 19 18:28:52 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 19 18:28:52 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 19 18:28:52 localhost systemd[1]: Finished Create System Users.
Feb 19 18:28:52 localhost systemd-journald[306]: Journal started
Feb 19 18:28:52 localhost systemd-journald[306]: Runtime Journal (/run/log/journal/30a688d16db846798c84eeb6a5211fb8) is 8.0M, max 153.6M, 145.6M free.
Feb 19 18:28:52 localhost systemd-sysusers[311]: Creating group 'users' with GID 100.
Feb 19 18:28:52 localhost systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Feb 19 18:28:52 localhost systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Feb 19 18:28:52 localhost systemd[1]: Started Journal Service.
Feb 19 18:28:52 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 19 18:28:52 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 19 18:28:52 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 19 18:28:52 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 19 18:28:52 localhost systemd[1]: Finished Setup Virtual Console.
Feb 19 18:28:52 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Feb 19 18:28:52 localhost systemd[1]: Starting dracut cmdline hook...
Feb 19 18:28:52 localhost dracut-cmdline[325]: dracut-9 dracut-057-110.git20260130.el9
Feb 19 18:28:52 localhost dracut-cmdline[325]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-681.el9.x86_64 root=UUID=9d578f93-c4e9-4172-8459-ef150e54751c ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 19 18:28:52 localhost systemd[1]: Finished dracut cmdline hook.
Feb 19 18:28:52 localhost systemd[1]: Starting dracut pre-udev hook...
Feb 19 18:28:52 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb 19 18:28:52 localhost kernel: device-mapper: uevent: version 1.0.3
Feb 19 18:28:52 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Feb 19 18:28:52 localhost kernel: RPC: Registered named UNIX socket transport module.
Feb 19 18:28:52 localhost kernel: RPC: Registered udp transport module.
Feb 19 18:28:52 localhost kernel: RPC: Registered tcp transport module.
Feb 19 18:28:52 localhost kernel: RPC: Registered tcp-with-tls transport module.
Feb 19 18:28:52 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Feb 19 18:28:52 localhost rpc.statd[441]: Version 2.5.4 starting
Feb 19 18:28:52 localhost rpc.statd[441]: Initializing NSM state
Feb 19 18:28:52 localhost rpc.idmapd[446]: Setting log level to 0
Feb 19 18:28:52 localhost systemd[1]: Finished dracut pre-udev hook.
Feb 19 18:28:52 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 19 18:28:52 localhost systemd-udevd[459]: Using default interface naming scheme 'rhel-9.0'.
Feb 19 18:28:53 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 19 18:28:53 localhost systemd[1]: Starting dracut pre-trigger hook...
Feb 19 18:28:53 localhost systemd[1]: Finished dracut pre-trigger hook.
Feb 19 18:28:53 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 19 18:28:53 localhost systemd[1]: Created slice Slice /system/modprobe.
Feb 19 18:28:53 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 19 18:28:53 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 19 18:28:53 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 19 18:28:53 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 19 18:28:53 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 19 18:28:53 localhost systemd[1]: Reached target Network.
Feb 19 18:28:53 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 19 18:28:53 localhost systemd[1]: Starting dracut initqueue hook...
Feb 19 18:28:53 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Feb 19 18:28:53 localhost systemd-udevd[462]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 18:28:53 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Feb 19 18:28:53 localhost kernel:  vda: vda1
Feb 19 18:28:53 localhost kernel: libata version 3.00 loaded.
Feb 19 18:28:53 localhost systemd[1]: Mounting Kernel Configuration File System...
Feb 19 18:28:53 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Feb 19 18:28:53 localhost kernel: ACPI: bus type drm_connector registered
Feb 19 18:28:53 localhost kernel: scsi host0: ata_piix
Feb 19 18:28:53 localhost kernel: scsi host1: ata_piix
Feb 19 18:28:53 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Feb 19 18:28:53 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Feb 19 18:28:53 localhost systemd[1]: Mounted Kernel Configuration File System.
Feb 19 18:28:53 localhost systemd[1]: Found device /dev/disk/by-uuid/9d578f93-c4e9-4172-8459-ef150e54751c.
Feb 19 18:28:53 localhost systemd[1]: Reached target Initrd Root Device.
Feb 19 18:28:53 localhost systemd[1]: Reached target System Initialization.
Feb 19 18:28:53 localhost systemd[1]: Reached target Basic System.
Feb 19 18:28:53 localhost kernel: ata1: found unknown device (class 0)
Feb 19 18:28:53 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Feb 19 18:28:53 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Feb 19 18:28:53 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Feb 19 18:28:53 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Feb 19 18:28:53 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Feb 19 18:28:53 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Feb 19 18:28:53 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Feb 19 18:28:53 localhost kernel: Console: switching to colour dummy device 80x25
Feb 19 18:28:53 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Feb 19 18:28:53 localhost kernel: [drm] features: -context_init
Feb 19 18:28:53 localhost kernel: [drm] number of scanouts: 1
Feb 19 18:28:53 localhost kernel: [drm] number of cap sets: 0
Feb 19 18:28:53 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Feb 19 18:28:53 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Feb 19 18:28:53 localhost kernel: Console: switching to colour frame buffer device 128x48
Feb 19 18:28:53 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Feb 19 18:28:53 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Feb 19 18:28:53 localhost systemd[1]: Finished dracut initqueue hook.
Feb 19 18:28:53 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Feb 19 18:28:53 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Feb 19 18:28:53 localhost systemd[1]: Reached target Remote File Systems.
Feb 19 18:28:53 localhost systemd[1]: Starting dracut pre-mount hook...
Feb 19 18:28:53 localhost systemd[1]: Finished dracut pre-mount hook.
Feb 19 18:28:53 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/9d578f93-c4e9-4172-8459-ef150e54751c...
Feb 19 18:28:53 localhost systemd-fsck[565]: /usr/sbin/fsck.xfs: XFS file system.
Feb 19 18:28:53 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/9d578f93-c4e9-4172-8459-ef150e54751c.
Feb 19 18:28:53 localhost systemd[1]: Mounting /sysroot...
Feb 19 18:28:54 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Feb 19 18:28:54 localhost kernel: XFS (vda1): Mounting V5 Filesystem 9d578f93-c4e9-4172-8459-ef150e54751c
Feb 19 18:28:54 localhost kernel: XFS (vda1): Ending clean mount
Feb 19 18:28:54 localhost systemd[1]: Mounted /sysroot.
Feb 19 18:28:54 localhost systemd[1]: Reached target Initrd Root File System.
Feb 19 18:28:54 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Feb 19 18:28:54 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Feb 19 18:28:54 localhost systemd[1]: Reached target Initrd File Systems.
Feb 19 18:28:54 localhost systemd[1]: Reached target Initrd Default Target.
Feb 19 18:28:54 localhost systemd[1]: Starting dracut mount hook...
Feb 19 18:28:54 localhost systemd[1]: Finished dracut mount hook.
Feb 19 18:28:54 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Feb 19 18:28:54 localhost rpc.idmapd[446]: exiting on signal 15
Feb 19 18:28:54 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Feb 19 18:28:54 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Feb 19 18:28:54 localhost systemd[1]: Stopped target Network.
Feb 19 18:28:54 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Feb 19 18:28:54 localhost systemd[1]: Stopped target Timer Units.
Feb 19 18:28:54 localhost systemd[1]: dbus.socket: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Feb 19 18:28:54 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Feb 19 18:28:54 localhost systemd[1]: Stopped target Initrd Default Target.
Feb 19 18:28:54 localhost systemd[1]: Stopped target Basic System.
Feb 19 18:28:54 localhost systemd[1]: Stopped target Initrd Root Device.
Feb 19 18:28:54 localhost systemd[1]: Stopped target Initrd /usr File System.
Feb 19 18:28:54 localhost systemd[1]: Stopped target Path Units.
Feb 19 18:28:54 localhost systemd[1]: Stopped target Remote File Systems.
Feb 19 18:28:54 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Feb 19 18:28:54 localhost systemd[1]: Stopped target Slice Units.
Feb 19 18:28:54 localhost systemd[1]: Stopped target Socket Units.
Feb 19 18:28:54 localhost systemd[1]: Stopped target System Initialization.
Feb 19 18:28:54 localhost systemd[1]: Stopped target Local File Systems.
Feb 19 18:28:54 localhost systemd[1]: Stopped target Swaps.
Feb 19 18:28:54 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Stopped dracut mount hook.
Feb 19 18:28:54 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Stopped dracut pre-mount hook.
Feb 19 18:28:54 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Feb 19 18:28:54 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Feb 19 18:28:54 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Stopped dracut initqueue hook.
Feb 19 18:28:54 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Stopped Apply Kernel Variables.
Feb 19 18:28:54 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Feb 19 18:28:54 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Stopped Coldplug All udev Devices.
Feb 19 18:28:54 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Stopped dracut pre-trigger hook.
Feb 19 18:28:54 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Feb 19 18:28:54 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Stopped Setup Virtual Console.
Feb 19 18:28:54 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Feb 19 18:28:54 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Feb 19 18:28:54 localhost systemd[1]: systemd-udevd.service: Consumed 1.118s CPU time.
Feb 19 18:28:54 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Closed udev Control Socket.
Feb 19 18:28:54 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Closed udev Kernel Socket.
Feb 19 18:28:54 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Stopped dracut pre-udev hook.
Feb 19 18:28:54 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Stopped dracut cmdline hook.
Feb 19 18:28:54 localhost systemd[1]: Starting Cleanup udev Database...
Feb 19 18:28:54 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Feb 19 18:28:54 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Feb 19 18:28:54 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Stopped Create System Users.
Feb 19 18:28:54 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb 19 18:28:54 localhost systemd[1]: Finished Cleanup udev Database.
Feb 19 18:28:54 localhost systemd[1]: Reached target Switch Root.
Feb 19 18:28:54 localhost systemd[1]: Starting Switch Root...
Feb 19 18:28:54 localhost systemd[1]: Switching root.
Feb 19 18:28:54 localhost systemd-journald[306]: Received SIGTERM from PID 1 (systemd).
Feb 19 18:28:54 localhost systemd-journald[306]: Journal stopped
Feb 19 18:28:55 localhost kernel: audit: type=1404 audit(1771525734.666:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Feb 19 18:28:55 localhost kernel: SELinux:  policy capability network_peer_controls=1
Feb 19 18:28:55 localhost kernel: SELinux:  policy capability open_perms=1
Feb 19 18:28:55 localhost kernel: SELinux:  policy capability extended_socket_class=1
Feb 19 18:28:55 localhost kernel: SELinux:  policy capability always_check_network=0
Feb 19 18:28:55 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 19 18:28:55 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 19 18:28:55 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 19 18:28:55 localhost kernel: audit: type=1403 audit(1771525734.782:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb 19 18:28:55 localhost systemd[1]: Successfully loaded SELinux policy in 118.249ms.
Feb 19 18:28:55 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 31.046ms.
Feb 19 18:28:55 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 19 18:28:55 localhost systemd[1]: Detected virtualization kvm.
Feb 19 18:28:55 localhost systemd[1]: Detected architecture x86-64.
Feb 19 18:28:55 localhost systemd-rc-local-generator[648]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 18:28:55 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Feb 19 18:28:55 localhost systemd[1]: Stopped Switch Root.
Feb 19 18:28:55 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb 19 18:28:55 localhost systemd[1]: Created slice Slice /system/getty.
Feb 19 18:28:55 localhost systemd[1]: Created slice Slice /system/serial-getty.
Feb 19 18:28:55 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Feb 19 18:28:55 localhost systemd[1]: Created slice User and Session Slice.
Feb 19 18:28:55 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 19 18:28:55 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Feb 19 18:28:55 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Feb 19 18:28:55 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 19 18:28:55 localhost systemd[1]: Stopped target Switch Root.
Feb 19 18:28:55 localhost systemd[1]: Stopped target Initrd File Systems.
Feb 19 18:28:55 localhost systemd[1]: Stopped target Initrd Root File System.
Feb 19 18:28:55 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Feb 19 18:28:55 localhost systemd[1]: Reached target Path Units.
Feb 19 18:28:55 localhost systemd[1]: Reached target rpc_pipefs.target.
Feb 19 18:28:55 localhost systemd[1]: Reached target Slice Units.
Feb 19 18:28:55 localhost systemd[1]: Reached target Swaps.
Feb 19 18:28:55 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Feb 19 18:28:55 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Feb 19 18:28:55 localhost systemd[1]: Reached target RPC Port Mapper.
Feb 19 18:28:55 localhost systemd[1]: Listening on Process Core Dump Socket.
Feb 19 18:28:55 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Feb 19 18:28:55 localhost systemd[1]: Listening on udev Control Socket.
Feb 19 18:28:55 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 19 18:28:55 localhost systemd[1]: Mounting Huge Pages File System...
Feb 19 18:28:55 localhost systemd[1]: Mounting POSIX Message Queue File System...
Feb 19 18:28:55 localhost systemd[1]: Mounting Kernel Debug File System...
Feb 19 18:28:55 localhost systemd[1]: Mounting Kernel Trace File System...
Feb 19 18:28:55 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 19 18:28:55 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 19 18:28:55 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 19 18:28:55 localhost systemd[1]: Starting Load Kernel Module drm...
Feb 19 18:28:55 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Feb 19 18:28:55 localhost systemd[1]: Starting Load Kernel Module fuse...
Feb 19 18:28:55 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Feb 19 18:28:55 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Feb 19 18:28:55 localhost systemd[1]: Stopped File System Check on Root Device.
Feb 19 18:28:55 localhost systemd[1]: Stopped Journal Service.
Feb 19 18:28:55 localhost systemd[1]: Starting Journal Service...
Feb 19 18:28:55 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Feb 19 18:28:55 localhost systemd[1]: Starting Generate network units from Kernel command line...
Feb 19 18:28:55 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 19 18:28:55 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Feb 19 18:28:55 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 19 18:28:55 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 19 18:28:55 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Feb 19 18:28:55 localhost kernel: fuse: init (API version 7.37)
Feb 19 18:28:55 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 19 18:28:55 localhost systemd[1]: Mounted Huge Pages File System.
Feb 19 18:28:55 localhost systemd[1]: Mounted POSIX Message Queue File System.
Feb 19 18:28:55 localhost systemd[1]: Mounted Kernel Debug File System.
Feb 19 18:28:55 localhost systemd-journald[696]: Journal started
Feb 19 18:28:55 localhost systemd-journald[696]: Runtime Journal (/run/log/journal/621707288f5710e1387f73ed6d90e964) is 8.0M, max 153.6M, 145.6M free.
Feb 19 18:28:55 localhost systemd[1]: Queued start job for default target Multi-User System.
Feb 19 18:28:55 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 19 18:28:55 localhost systemd[1]: Started Journal Service.
Feb 19 18:28:55 localhost systemd[1]: Mounted Kernel Trace File System.
Feb 19 18:28:55 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 19 18:28:55 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 19 18:28:55 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 19 18:28:55 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 19 18:28:55 localhost systemd[1]: Finished Load Kernel Module drm.
Feb 19 18:28:55 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb 19 18:28:55 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Feb 19 18:28:55 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Feb 19 18:28:55 localhost systemd[1]: Finished Load Kernel Module fuse.
Feb 19 18:28:55 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Feb 19 18:28:55 localhost systemd[1]: Finished Generate network units from Kernel command line.
Feb 19 18:28:55 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Feb 19 18:28:55 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 19 18:28:55 localhost systemd[1]: Mounting FUSE Control File System...
Feb 19 18:28:55 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 19 18:28:55 localhost systemd[1]: Starting Rebuild Hardware Database...
Feb 19 18:28:55 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Feb 19 18:28:55 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb 19 18:28:55 localhost systemd[1]: Starting Load/Save OS Random Seed...
Feb 19 18:28:55 localhost systemd[1]: Starting Create System Users...
Feb 19 18:28:55 localhost systemd[1]: Mounted FUSE Control File System.
Feb 19 18:28:55 localhost systemd-journald[696]: Runtime Journal (/run/log/journal/621707288f5710e1387f73ed6d90e964) is 8.0M, max 153.6M, 145.6M free.
Feb 19 18:28:55 localhost systemd-journald[696]: Received client request to flush runtime journal.
Feb 19 18:28:55 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 19 18:28:55 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Feb 19 18:28:55 localhost systemd[1]: Finished Load/Save OS Random Seed.
Feb 19 18:28:55 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 19 18:28:55 localhost systemd[1]: Finished Create System Users.
Feb 19 18:28:55 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 19 18:28:55 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 19 18:28:55 localhost systemd[1]: Reached target Preparation for Local File Systems.
Feb 19 18:28:55 localhost systemd[1]: Reached target Local File Systems.
Feb 19 18:28:55 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Feb 19 18:28:55 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Feb 19 18:28:55 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 19 18:28:55 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Feb 19 18:28:55 localhost systemd[1]: Starting Automatic Boot Loader Update...
Feb 19 18:28:55 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Feb 19 18:28:55 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 19 18:28:55 localhost bootctl[715]: Couldn't find EFI system partition, skipping.
Feb 19 18:28:55 localhost systemd[1]: Finished Automatic Boot Loader Update.
Feb 19 18:28:55 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 19 18:28:55 localhost systemd[1]: Starting Security Auditing Service...
Feb 19 18:28:55 localhost systemd[1]: Starting RPC Bind...
Feb 19 18:28:55 localhost systemd[1]: Starting Rebuild Journal Catalog...
Feb 19 18:28:55 localhost auditd[721]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Feb 19 18:28:55 localhost auditd[721]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Feb 19 18:28:55 localhost systemd[1]: Finished Rebuild Journal Catalog.
Feb 19 18:28:55 localhost systemd[1]: Started RPC Bind.
Feb 19 18:28:55 localhost augenrules[726]: /sbin/augenrules: No change
Feb 19 18:28:55 localhost augenrules[741]: No rules
Feb 19 18:28:55 localhost augenrules[741]: enabled 1
Feb 19 18:28:55 localhost augenrules[741]: failure 1
Feb 19 18:28:55 localhost augenrules[741]: pid 721
Feb 19 18:28:55 localhost augenrules[741]: rate_limit 0
Feb 19 18:28:55 localhost augenrules[741]: backlog_limit 8192
Feb 19 18:28:55 localhost augenrules[741]: lost 0
Feb 19 18:28:55 localhost augenrules[741]: backlog 3
Feb 19 18:28:55 localhost augenrules[741]: backlog_wait_time 60000
Feb 19 18:28:55 localhost augenrules[741]: backlog_wait_time_actual 0
Feb 19 18:28:55 localhost augenrules[741]: enabled 1
Feb 19 18:28:55 localhost augenrules[741]: failure 1
Feb 19 18:28:55 localhost augenrules[741]: pid 721
Feb 19 18:28:55 localhost augenrules[741]: rate_limit 0
Feb 19 18:28:55 localhost augenrules[741]: backlog_limit 8192
Feb 19 18:28:55 localhost augenrules[741]: lost 0
Feb 19 18:28:55 localhost augenrules[741]: backlog 3
Feb 19 18:28:55 localhost augenrules[741]: backlog_wait_time 60000
Feb 19 18:28:55 localhost augenrules[741]: backlog_wait_time_actual 0
Feb 19 18:28:55 localhost augenrules[741]: enabled 1
Feb 19 18:28:55 localhost augenrules[741]: failure 1
Feb 19 18:28:55 localhost augenrules[741]: pid 721
Feb 19 18:28:55 localhost augenrules[741]: rate_limit 0
Feb 19 18:28:55 localhost augenrules[741]: backlog_limit 8192
Feb 19 18:28:55 localhost augenrules[741]: lost 0
Feb 19 18:28:55 localhost augenrules[741]: backlog 0
Feb 19 18:28:55 localhost augenrules[741]: backlog_wait_time 60000
Feb 19 18:28:55 localhost augenrules[741]: backlog_wait_time_actual 0
Feb 19 18:28:55 localhost systemd[1]: Started Security Auditing Service.
Feb 19 18:28:55 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Feb 19 18:28:55 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Feb 19 18:28:56 localhost systemd[1]: Finished Rebuild Hardware Database.
Feb 19 18:28:56 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 19 18:28:56 localhost systemd-udevd[749]: Using default interface naming scheme 'rhel-9.0'.
Feb 19 18:28:56 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Feb 19 18:28:56 localhost systemd[1]: Starting Update is Completed...
Feb 19 18:28:56 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 19 18:28:56 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 19 18:28:56 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 19 18:28:56 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 19 18:28:56 localhost systemd[1]: Finished Update is Completed.
Feb 19 18:28:56 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Feb 19 18:28:56 localhost systemd[1]: Reached target System Initialization.
Feb 19 18:28:56 localhost systemd[1]: Started dnf makecache --timer.
Feb 19 18:28:56 localhost systemd-udevd[754]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 18:28:56 localhost systemd[1]: Started Daily rotation of log files.
Feb 19 18:28:56 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Feb 19 18:28:56 localhost systemd[1]: Reached target Timer Units.
Feb 19 18:28:56 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 19 18:28:56 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Feb 19 18:28:56 localhost systemd[1]: Reached target Socket Units.
Feb 19 18:28:56 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Feb 19 18:28:56 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Feb 19 18:28:56 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Feb 19 18:28:56 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Feb 19 18:28:56 localhost systemd[1]: Starting D-Bus System Message Bus...
Feb 19 18:28:56 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 19 18:28:56 localhost systemd[1]: Started D-Bus System Message Bus.
Feb 19 18:28:56 localhost systemd[1]: Reached target Basic System.
Feb 19 18:28:56 localhost dbus-broker-lau[794]: Ready
Feb 19 18:28:56 localhost systemd[1]: Starting NTP client/server...
Feb 19 18:28:56 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Feb 19 18:28:56 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Feb 19 18:28:56 localhost systemd[1]: Starting IPv4 firewall with iptables...
Feb 19 18:28:56 localhost systemd[1]: Started irqbalance daemon.
Feb 19 18:28:56 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Feb 19 18:28:56 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 19 18:28:56 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 19 18:28:56 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 19 18:28:56 localhost systemd[1]: Reached target sshd-keygen.target.
Feb 19 18:28:56 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Feb 19 18:28:56 localhost systemd[1]: Reached target User and Group Name Lookups.
Feb 19 18:28:56 localhost systemd[1]: Starting User Login Management...
Feb 19 18:28:56 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Feb 19 18:28:56 localhost kernel: kvm_amd: TSC scaling supported
Feb 19 18:28:56 localhost kernel: kvm_amd: Nested Virtualization enabled
Feb 19 18:28:56 localhost kernel: kvm_amd: Nested Paging enabled
Feb 19 18:28:56 localhost kernel: kvm_amd: LBR virtualization supported
Feb 19 18:28:56 localhost chronyd[830]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Feb 19 18:28:56 localhost chronyd[830]: Loaded 0 symmetric keys
Feb 19 18:28:56 localhost chronyd[830]: Using right/UTC timezone to obtain leap second data
Feb 19 18:28:56 localhost chronyd[830]: Loaded seccomp filter (level 2)
Feb 19 18:28:56 localhost systemd[1]: Started NTP client/server.
Feb 19 18:28:56 localhost systemd-logind[822]: New seat seat0.
Feb 19 18:28:56 localhost systemd-logind[822]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 19 18:28:56 localhost systemd-logind[822]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 19 18:28:56 localhost systemd[1]: Started User Login Management.
Feb 19 18:28:56 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Feb 19 18:28:56 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Feb 19 18:28:56 localhost iptables.init[814]: iptables: Applying firewall rules: [  OK  ]
Feb 19 18:28:56 localhost systemd[1]: Finished IPv4 firewall with iptables.
Feb 19 18:28:57 localhost cloud-init[855]: Cloud-init v. 24.4-8.el9 running 'init-local' at Thu, 19 Feb 2026 18:28:57 +0000. Up 6.74 seconds.
Feb 19 18:28:57 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Feb 19 18:28:57 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Feb 19 18:28:57 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpzb61v6be.mount: Deactivated successfully.
Feb 19 18:28:57 localhost systemd[1]: Starting Hostname Service...
Feb 19 18:28:57 localhost systemd[1]: Started Hostname Service.
Feb 19 18:28:57 np0005624716.novalocal systemd-hostnamed[869]: Hostname set to <np0005624716.novalocal> (static)
Feb 19 18:28:57 np0005624716.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Feb 19 18:28:57 np0005624716.novalocal systemd[1]: Reached target Preparation for Network.
Feb 19 18:28:57 np0005624716.novalocal systemd[1]: Starting Network Manager...
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7055] NetworkManager (version 1.54.3-2.el9) is starting... (boot:ff499d4a-6be1-401f-8b91-3dc243d525b1)
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7061] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7200] manager[0x557931292000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7259] hostname: hostname: using hostnamed
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7259] hostname: static hostname changed from (none) to "np0005624716.novalocal"
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7263] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7396] manager[0x557931292000]: rfkill: Wi-Fi hardware radio set enabled
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7397] manager[0x557931292000]: rfkill: WWAN hardware radio set enabled
Feb 19 18:28:57 np0005624716.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7476] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7476] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7477] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7477] manager: Networking is enabled by state file
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7478] settings: Loaded settings plugin: keyfile (internal)
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7509] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7532] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7545] dhcp: init: Using DHCP client 'internal'
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7547] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7558] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7569] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7580] device (lo): Activation: starting connection 'lo' (4563c9fb-fd0a-4471-af90-84b9f0d73c08)
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7587] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7590] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7619] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7624] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7626] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7628] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7631] device (eth0): carrier: link connected
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7634] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7639] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7644] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7647] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7648] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7650] manager: NetworkManager state is now CONNECTING
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7651] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7656] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7659] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7698] dhcp4 (eth0): state changed new lease, address=38.102.83.219
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7704] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 19 18:28:57 np0005624716.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7720] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 19 18:28:57 np0005624716.novalocal systemd[1]: Started Network Manager.
Feb 19 18:28:57 np0005624716.novalocal systemd[1]: Reached target Network.
Feb 19 18:28:57 np0005624716.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 19 18:28:57 np0005624716.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Feb 19 18:28:57 np0005624716.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7968] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.7978] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.8003] device (lo): Activation: successful, device activated.
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.8017] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.8041] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.8051] manager: NetworkManager state is now CONNECTED_SITE
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.8060] device (eth0): Activation: successful, device activated.
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.8071] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 19 18:28:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525737.8081] manager: startup complete
Feb 19 18:28:57 np0005624716.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Feb 19 18:28:57 np0005624716.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 19 18:28:57 np0005624716.novalocal systemd[1]: Reached target NFS client services.
Feb 19 18:28:57 np0005624716.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Feb 19 18:28:57 np0005624716.novalocal systemd[1]: Reached target Remote File Systems.
Feb 19 18:28:57 np0005624716.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 19 18:28:57 np0005624716.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 19 18:28:57 np0005624716.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: Cloud-init v. 24.4-8.el9 running 'init' at Thu, 19 Feb 2026 18:28:58 +0000. Up 7.75 seconds.
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: |  eth0  | True |        38.102.83.219         | 255.255.255.0 | global | fa:16:3e:a2:5d:ce |
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: |  eth0  | True | fe80::f816:3eff:fea2:5dce/64 |       .       |  link  | fa:16:3e:a2:5d:ce |
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Feb 19 18:28:58 np0005624716.novalocal cloud-init[936]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 19 18:28:59 np0005624716.novalocal useradd[1002]: new group: name=cloud-user, GID=1001
Feb 19 18:28:59 np0005624716.novalocal useradd[1002]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Feb 19 18:28:59 np0005624716.novalocal useradd[1002]: add 'cloud-user' to group 'adm'
Feb 19 18:28:59 np0005624716.novalocal useradd[1002]: add 'cloud-user' to group 'systemd-journal'
Feb 19 18:28:59 np0005624716.novalocal useradd[1002]: add 'cloud-user' to shadow group 'adm'
Feb 19 18:28:59 np0005624716.novalocal useradd[1002]: add 'cloud-user' to shadow group 'systemd-journal'
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: Generating public/private rsa key pair.
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: The key fingerprint is:
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: SHA256:u1wBACeC0S5NbyueM8ucnNJ5LA35JzswHAYyXa/tDDk root@np0005624716.novalocal
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: The key's randomart image is:
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: +---[RSA 3072]----+
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |.=..+.o          |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |= +. + .         |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |.* .  . .        |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |. = o+   .       |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: | + +E.. S .      |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |  B .=   . .     |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: | o @  o . .      |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |.+OoB .. o       |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: | .B*.=  o        |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: +----[SHA256]-----+
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: Generating public/private ecdsa key pair.
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: The key fingerprint is:
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: SHA256:gLD+yCCW55myNovggPGXkLsYTlIN7SZvZZLMoIgGda8 root@np0005624716.novalocal
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: The key's randomart image is:
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: +---[ECDSA 256]---+
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |  o .            |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: | . = o           |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |. + o o          |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |+o.O o .         |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |B+*.E o S        |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |=*oOo=           |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |*o=+*            |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |O=o+             |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |+=+              |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: +----[SHA256]-----+
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: Generating public/private ed25519 key pair.
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: The key fingerprint is:
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: SHA256:oJ+/cIRmD8yJ9lAKD+PBCx7b0IAiRaiNh1N00/OhDXo root@np0005624716.novalocal
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: The key's randomart image is:
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: +--[ED25519 256]--+
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |.*+ o.           |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |=.+. .+ .        |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |=B*. .o* .       |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |*oBB.BE+o        |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: | =o.B.O S        |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |   . * =         |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |      = o        |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |       +         |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: |        o.       |
Feb 19 18:28:59 np0005624716.novalocal cloud-init[936]: +----[SHA256]-----+
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Reached target Cloud-config availability.
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Reached target Network is Online.
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Starting Crash recovery kernel arming...
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Starting System Logging Service...
Feb 19 18:28:59 np0005624716.novalocal sm-notify[1018]: Version 2.5.4 starting
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Starting OpenSSH server daemon...
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Starting Permit User Sessions...
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Started Notify NFS peers of a restart.
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Finished Permit User Sessions.
Feb 19 18:28:59 np0005624716.novalocal sshd[1020]: Server listening on 0.0.0.0 port 22.
Feb 19 18:28:59 np0005624716.novalocal sshd[1020]: Server listening on :: port 22.
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Started OpenSSH server daemon.
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Started Command Scheduler.
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Started Getty on tty1.
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Started Serial Getty on ttyS0.
Feb 19 18:28:59 np0005624716.novalocal crond[1023]: (CRON) STARTUP (1.5.7)
Feb 19 18:28:59 np0005624716.novalocal crond[1023]: (CRON) INFO (Syslog will be used instead of sendmail.)
Feb 19 18:28:59 np0005624716.novalocal crond[1023]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 8% if used.)
Feb 19 18:28:59 np0005624716.novalocal crond[1023]: (CRON) INFO (running with inotify support)
Feb 19 18:28:59 np0005624716.novalocal rsyslogd[1019]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1019" x-info="https://www.rsyslog.com"] start
Feb 19 18:28:59 np0005624716.novalocal rsyslogd[1019]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Reached target Login Prompts.
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Started System Logging Service.
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Reached target Multi-User System.
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Feb 19 18:28:59 np0005624716.novalocal rsyslogd[1019]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 19 18:28:59 np0005624716.novalocal cloud-init[1146]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Thu, 19 Feb 2026 18:28:59 +0000. Up 9.38 seconds.
Feb 19 18:28:59 np0005624716.novalocal kdumpctl[1032]: kdump: No kdump initial ramdisk found.
Feb 19 18:28:59 np0005624716.novalocal kdumpctl[1032]: kdump: Rebuilding /boot/initramfs-5.14.0-681.el9.x86_64kdump.img
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Feb 19 18:28:59 np0005624716.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Feb 19 18:29:00 np0005624716.novalocal cloud-init[1408]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Thu, 19 Feb 2026 18:29:00 +0000. Up 9.76 seconds.
Feb 19 18:29:00 np0005624716.novalocal cloud-init[1443]: #############################################################
Feb 19 18:29:00 np0005624716.novalocal cloud-init[1449]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Feb 19 18:29:00 np0005624716.novalocal cloud-init[1457]: 256 SHA256:gLD+yCCW55myNovggPGXkLsYTlIN7SZvZZLMoIgGda8 root@np0005624716.novalocal (ECDSA)
Feb 19 18:29:00 np0005624716.novalocal cloud-init[1462]: 256 SHA256:oJ+/cIRmD8yJ9lAKD+PBCx7b0IAiRaiNh1N00/OhDXo root@np0005624716.novalocal (ED25519)
Feb 19 18:29:00 np0005624716.novalocal cloud-init[1466]: 3072 SHA256:u1wBACeC0S5NbyueM8ucnNJ5LA35JzswHAYyXa/tDDk root@np0005624716.novalocal (RSA)
Feb 19 18:29:00 np0005624716.novalocal cloud-init[1468]: -----END SSH HOST KEY FINGERPRINTS-----
Feb 19 18:29:00 np0005624716.novalocal cloud-init[1469]: #############################################################
Feb 19 18:29:00 np0005624716.novalocal cloud-init[1408]: Cloud-init v. 24.4-8.el9 finished at Thu, 19 Feb 2026 18:29:00 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.93 seconds
Feb 19 18:29:00 np0005624716.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Feb 19 18:29:00 np0005624716.novalocal systemd[1]: Reached target Cloud-init target.
Feb 19 18:29:00 np0005624716.novalocal dracut[1524]: dracut-057-110.git20260130.el9
Feb 19 18:29:00 np0005624716.novalocal dracut[1526]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/9d578f93-c4e9-4172-8459-ef150e54751c /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-681.el9.x86_64kdump.img 5.14.0-681.el9.x86_64
Feb 19 18:29:00 np0005624716.novalocal sshd-session[1549]: Connection reset by 38.102.83.114 port 37916 [preauth]
Feb 19 18:29:00 np0005624716.novalocal sshd-session[1569]: Unable to negotiate with 38.102.83.114 port 37922: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Feb 19 18:29:00 np0005624716.novalocal sshd-session[1585]: Connection reset by 38.102.83.114 port 37934 [preauth]
Feb 19 18:29:00 np0005624716.novalocal sshd-session[1598]: Unable to negotiate with 38.102.83.114 port 37948: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Feb 19 18:29:00 np0005624716.novalocal sshd-session[1600]: Unable to negotiate with 38.102.83.114 port 37964: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Feb 19 18:29:00 np0005624716.novalocal sshd-session[1602]: Connection reset by 38.102.83.114 port 37980 [preauth]
Feb 19 18:29:00 np0005624716.novalocal sshd-session[1604]: Connection reset by 38.102.83.114 port 37982 [preauth]
Feb 19 18:29:00 np0005624716.novalocal sshd-session[1606]: Unable to negotiate with 38.102.83.114 port 37990: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Feb 19 18:29:00 np0005624716.novalocal sshd-session[1611]: Unable to negotiate with 38.102.83.114 port 37992: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: Module 'resume' will not be installed, because it's in the list to be omitted!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: memstrack is not available
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 19 18:29:01 np0005624716.novalocal dracut[1526]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 19 18:29:02 np0005624716.novalocal dracut[1526]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 19 18:29:02 np0005624716.novalocal dracut[1526]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 19 18:29:02 np0005624716.novalocal dracut[1526]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 19 18:29:02 np0005624716.novalocal dracut[1526]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 19 18:29:02 np0005624716.novalocal dracut[1526]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 19 18:29:02 np0005624716.novalocal dracut[1526]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 19 18:29:02 np0005624716.novalocal dracut[1526]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 19 18:29:02 np0005624716.novalocal dracut[1526]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 19 18:29:02 np0005624716.novalocal dracut[1526]: memstrack is not available
Feb 19 18:29:02 np0005624716.novalocal dracut[1526]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 19 18:29:02 np0005624716.novalocal dracut[1526]: *** Including module: systemd ***
Feb 19 18:29:02 np0005624716.novalocal dracut[1526]: *** Including module: fips ***
Feb 19 18:29:02 np0005624716.novalocal chronyd[830]: Selected source 209.227.173.244 (2.centos.pool.ntp.org)
Feb 19 18:29:02 np0005624716.novalocal chronyd[830]: System clock TAI offset set to 37 seconds
Feb 19 18:29:02 np0005624716.novalocal dracut[1526]: *** Including module: systemd-initrd ***
Feb 19 18:29:02 np0005624716.novalocal dracut[1526]: *** Including module: i18n ***
Feb 19 18:29:03 np0005624716.novalocal dracut[1526]: *** Including module: drm ***
Feb 19 18:29:03 np0005624716.novalocal dracut[1526]: *** Including module: prefixdevname ***
Feb 19 18:29:03 np0005624716.novalocal dracut[1526]: *** Including module: kernel-modules ***
Feb 19 18:29:03 np0005624716.novalocal kernel: block vda: the capability attribute has been deprecated.
Feb 19 18:29:04 np0005624716.novalocal dracut[1526]: *** Including module: kernel-modules-extra ***
Feb 19 18:29:04 np0005624716.novalocal dracut[1526]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Feb 19 18:29:04 np0005624716.novalocal dracut[1526]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Feb 19 18:29:04 np0005624716.novalocal dracut[1526]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Feb 19 18:29:04 np0005624716.novalocal dracut[1526]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Feb 19 18:29:04 np0005624716.novalocal dracut[1526]: *** Including module: qemu ***
Feb 19 18:29:04 np0005624716.novalocal dracut[1526]: *** Including module: fstab-sys ***
Feb 19 18:29:04 np0005624716.novalocal dracut[1526]: *** Including module: rootfs-block ***
Feb 19 18:29:04 np0005624716.novalocal dracut[1526]: *** Including module: terminfo ***
Feb 19 18:29:04 np0005624716.novalocal dracut[1526]: *** Including module: udev-rules ***
Feb 19 18:29:04 np0005624716.novalocal dracut[1526]: Skipping udev rule: 91-permissions.rules
Feb 19 18:29:04 np0005624716.novalocal dracut[1526]: Skipping udev rule: 80-drivers-modprobe.rules
Feb 19 18:29:04 np0005624716.novalocal dracut[1526]: *** Including module: virtiofs ***
Feb 19 18:29:04 np0005624716.novalocal dracut[1526]: *** Including module: dracut-systemd ***
Feb 19 18:29:05 np0005624716.novalocal dracut[1526]: *** Including module: usrmount ***
Feb 19 18:29:05 np0005624716.novalocal dracut[1526]: *** Including module: base ***
Feb 19 18:29:05 np0005624716.novalocal dracut[1526]: *** Including module: fs-lib ***
Feb 19 18:29:05 np0005624716.novalocal dracut[1526]: *** Including module: kdumpbase ***
Feb 19 18:29:05 np0005624716.novalocal dracut[1526]: *** Including module: microcode_ctl-fw_dir_override ***
Feb 19 18:29:05 np0005624716.novalocal dracut[1526]:   microcode_ctl module: mangling fw_dir
Feb 19 18:29:05 np0005624716.novalocal dracut[1526]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Feb 19 18:29:05 np0005624716.novalocal dracut[1526]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Feb 19 18:29:05 np0005624716.novalocal dracut[1526]:     microcode_ctl: configuration "intel" is ignored
Feb 19 18:29:05 np0005624716.novalocal dracut[1526]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Feb 19 18:29:06 np0005624716.novalocal irqbalance[817]: Cannot change IRQ 35 affinity: Operation not permitted
Feb 19 18:29:06 np0005624716.novalocal irqbalance[817]: IRQ 35 affinity is now unmanaged
Feb 19 18:29:06 np0005624716.novalocal irqbalance[817]: Cannot change IRQ 25 affinity: Operation not permitted
Feb 19 18:29:06 np0005624716.novalocal irqbalance[817]: IRQ 25 affinity is now unmanaged
Feb 19 18:29:06 np0005624716.novalocal irqbalance[817]: Cannot change IRQ 33 affinity: Operation not permitted
Feb 19 18:29:06 np0005624716.novalocal irqbalance[817]: IRQ 33 affinity is now unmanaged
Feb 19 18:29:06 np0005624716.novalocal irqbalance[817]: Cannot change IRQ 31 affinity: Operation not permitted
Feb 19 18:29:06 np0005624716.novalocal irqbalance[817]: IRQ 31 affinity is now unmanaged
Feb 19 18:29:06 np0005624716.novalocal irqbalance[817]: Cannot change IRQ 26 affinity: Operation not permitted
Feb 19 18:29:06 np0005624716.novalocal irqbalance[817]: IRQ 26 affinity is now unmanaged
Feb 19 18:29:06 np0005624716.novalocal irqbalance[817]: Cannot change IRQ 34 affinity: Operation not permitted
Feb 19 18:29:06 np0005624716.novalocal irqbalance[817]: IRQ 34 affinity is now unmanaged
Feb 19 18:29:06 np0005624716.novalocal irqbalance[817]: Cannot change IRQ 32 affinity: Operation not permitted
Feb 19 18:29:06 np0005624716.novalocal irqbalance[817]: IRQ 32 affinity is now unmanaged
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]: *** Including module: openssl ***
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]: *** Including module: shutdown ***
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]: *** Including module: squash ***
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]: *** Including modules done ***
Feb 19 18:29:06 np0005624716.novalocal dracut[1526]: *** Installing kernel module dependencies ***
Feb 19 18:29:07 np0005624716.novalocal dracut[1526]: *** Installing kernel module dependencies done ***
Feb 19 18:29:07 np0005624716.novalocal dracut[1526]: *** Resolving executable dependencies ***
Feb 19 18:29:07 np0005624716.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 19 18:29:08 np0005624716.novalocal dracut[1526]: *** Resolving executable dependencies done ***
Feb 19 18:29:08 np0005624716.novalocal dracut[1526]: *** Generating early-microcode cpio image ***
Feb 19 18:29:08 np0005624716.novalocal dracut[1526]: *** Store current command line parameters ***
Feb 19 18:29:08 np0005624716.novalocal dracut[1526]: Stored kernel commandline:
Feb 19 18:29:08 np0005624716.novalocal dracut[1526]: No dracut internal kernel commandline stored in the initramfs
Feb 19 18:29:09 np0005624716.novalocal dracut[1526]: *** Install squash loader ***
Feb 19 18:29:09 np0005624716.novalocal dracut[1526]: *** Squashing the files inside the initramfs ***
Feb 19 18:29:11 np0005624716.novalocal dracut[1526]: *** Squashing the files inside the initramfs done ***
Feb 19 18:29:11 np0005624716.novalocal dracut[1526]: *** Creating image file '/boot/initramfs-5.14.0-681.el9.x86_64kdump.img' ***
Feb 19 18:29:11 np0005624716.novalocal dracut[1526]: *** Hardlinking files ***
Feb 19 18:29:11 np0005624716.novalocal dracut[1526]: Mode:           real
Feb 19 18:29:11 np0005624716.novalocal dracut[1526]: Files:          50
Feb 19 18:29:11 np0005624716.novalocal dracut[1526]: Linked:         0 files
Feb 19 18:29:11 np0005624716.novalocal dracut[1526]: Compared:       0 xattrs
Feb 19 18:29:11 np0005624716.novalocal dracut[1526]: Compared:       0 files
Feb 19 18:29:11 np0005624716.novalocal dracut[1526]: Saved:          0 B
Feb 19 18:29:11 np0005624716.novalocal dracut[1526]: Duration:       0.000458 seconds
Feb 19 18:29:11 np0005624716.novalocal dracut[1526]: *** Hardlinking files done ***
Feb 19 18:29:11 np0005624716.novalocal dracut[1526]: *** Creating initramfs image file '/boot/initramfs-5.14.0-681.el9.x86_64kdump.img' done ***
Feb 19 18:29:12 np0005624716.novalocal kdumpctl[1032]: kdump: kexec: loaded kdump kernel
Feb 19 18:29:12 np0005624716.novalocal kdumpctl[1032]: kdump: Starting kdump: [OK]
Feb 19 18:29:12 np0005624716.novalocal systemd[1]: Finished Crash recovery kernel arming.
Feb 19 18:29:12 np0005624716.novalocal systemd[1]: Startup finished in 1.567s (kernel) + 2.702s (initrd) + 17.356s (userspace) = 21.626s.
Feb 19 18:29:19 np0005624716.novalocal sshd-session[4800]: Accepted publickey for zuul from 38.102.83.114 port 58620 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Feb 19 18:29:19 np0005624716.novalocal systemd-logind[822]: New session 1 of user zuul.
Feb 19 18:29:19 np0005624716.novalocal systemd[1]: Created slice User Slice of UID 1000.
Feb 19 18:29:19 np0005624716.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Feb 19 18:29:19 np0005624716.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Feb 19 18:29:19 np0005624716.novalocal systemd[1]: Starting User Manager for UID 1000...
Feb 19 18:29:19 np0005624716.novalocal systemd[4804]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 18:29:19 np0005624716.novalocal systemd[4804]: Queued start job for default target Main User Target.
Feb 19 18:29:19 np0005624716.novalocal systemd[4804]: Created slice User Application Slice.
Feb 19 18:29:19 np0005624716.novalocal systemd[4804]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 19 18:29:19 np0005624716.novalocal systemd[4804]: Started Daily Cleanup of User's Temporary Directories.
Feb 19 18:29:19 np0005624716.novalocal systemd[4804]: Reached target Paths.
Feb 19 18:29:19 np0005624716.novalocal systemd[4804]: Reached target Timers.
Feb 19 18:29:19 np0005624716.novalocal systemd[4804]: Starting D-Bus User Message Bus Socket...
Feb 19 18:29:19 np0005624716.novalocal systemd[4804]: Starting Create User's Volatile Files and Directories...
Feb 19 18:29:19 np0005624716.novalocal systemd[4804]: Listening on D-Bus User Message Bus Socket.
Feb 19 18:29:19 np0005624716.novalocal systemd[4804]: Reached target Sockets.
Feb 19 18:29:19 np0005624716.novalocal systemd[4804]: Finished Create User's Volatile Files and Directories.
Feb 19 18:29:19 np0005624716.novalocal systemd[4804]: Reached target Basic System.
Feb 19 18:29:19 np0005624716.novalocal systemd[4804]: Reached target Main User Target.
Feb 19 18:29:19 np0005624716.novalocal systemd[4804]: Startup finished in 134ms.
Feb 19 18:29:19 np0005624716.novalocal systemd[1]: Started User Manager for UID 1000.
Feb 19 18:29:19 np0005624716.novalocal systemd[1]: Started Session 1 of User zuul.
Feb 19 18:29:19 np0005624716.novalocal sshd-session[4800]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 18:29:19 np0005624716.novalocal python3[4886]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 18:29:22 np0005624716.novalocal python3[4914]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 18:29:27 np0005624716.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 19 18:29:29 np0005624716.novalocal python3[4974]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 18:29:30 np0005624716.novalocal python3[5014]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Feb 19 18:29:32 np0005624716.novalocal python3[5040]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDWIxuVi6eMKOpitzeMBvFptIcZbMBDlsjsNVazRudqsoHdvFf8t4FwmIYeHLtGxfohtV1gmO6W8LjV0XCcv5/xj9Fvlx2iur1ACAEZcPrD/XYmaSPWos9QN55Zv/537/F+qxGsdEcGGj8g0NJo+jgxpvRTCkhW1wTybfdYpVNHQpoXb405JhzhySFmyYja9HM/SZlBSVIzd6OxicJdgNQWvLHOldoKajQVnyKo5UeL0wSWeERbngwJRCBgHRzRJsTwdg1RbFwLmwnoActEbp1MvugJsIObZWHZ6c13BCxVBJFRUljTEvDBU1jcxnr5AMkhLgTiKLd54hqXCrFFfCoFitY3rC4GwbPCnXdxZgwUTBp+UtmlQSL7C+Fqgm5tVHpGrnEpc6zGyE+7EJWd0dY+COxEXPGrpeiq/si6v6OZLHRZjpgSR7ug83hkOOVqpCrN2KOOI8nECZ8PyuKarnCGRJr9/nRUQiSubeiFnhsYsr+5kGba26SVSTq1ePAABQ0= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:33 np0005624716.novalocal python3[5064]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:29:33 np0005624716.novalocal python3[5163]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 19 18:29:33 np0005624716.novalocal python3[5234]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771525773.2953107-229-278700927827379/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=71b93b94ebbb4542ad888eccd7db5743_id_rsa follow=False checksum=dde44297f81348653ed29f602548239c28f0e430 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:29:34 np0005624716.novalocal python3[5357]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 19 18:29:34 np0005624716.novalocal python3[5428]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771525774.2681625-273-233731785307710/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=71b93b94ebbb4542ad888eccd7db5743_id_rsa.pub follow=False checksum=5ddafbde73e39575202c87aa6eb8d065057f6c98 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:29:36 np0005624716.novalocal python3[5476]: ansible-ping Invoked with data=pong
Feb 19 18:29:37 np0005624716.novalocal python3[5500]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 18:29:39 np0005624716.novalocal python3[5558]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Feb 19 18:29:40 np0005624716.novalocal python3[5590]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:29:40 np0005624716.novalocal python3[5614]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:29:41 np0005624716.novalocal python3[5638]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:29:41 np0005624716.novalocal python3[5662]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:29:41 np0005624716.novalocal python3[5686]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:29:42 np0005624716.novalocal python3[5710]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:29:43 np0005624716.novalocal sudo[5734]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opbwzjdqejxknsnaepbfquhhucozimbg ; /usr/bin/python3'
Feb 19 18:29:43 np0005624716.novalocal sudo[5734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:29:43 np0005624716.novalocal python3[5736]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:29:43 np0005624716.novalocal sudo[5734]: pam_unix(sudo:session): session closed for user root
Feb 19 18:29:44 np0005624716.novalocal sudo[5812]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewmdwotebgzopukdsafxodpivclhjmnh ; /usr/bin/python3'
Feb 19 18:29:44 np0005624716.novalocal sudo[5812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:29:44 np0005624716.novalocal python3[5814]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 19 18:29:44 np0005624716.novalocal sudo[5812]: pam_unix(sudo:session): session closed for user root
Feb 19 18:29:44 np0005624716.novalocal sudo[5885]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwxntpxyvtnavuxhwslwhlvyxhlnblxv ; /usr/bin/python3'
Feb 19 18:29:44 np0005624716.novalocal sudo[5885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:29:45 np0005624716.novalocal python3[5887]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771525784.0101154-26-40021542775021/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:29:45 np0005624716.novalocal sudo[5885]: pam_unix(sudo:session): session closed for user root
Feb 19 18:29:45 np0005624716.novalocal python3[5935]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:45 np0005624716.novalocal python3[5959]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:46 np0005624716.novalocal python3[5983]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:46 np0005624716.novalocal python3[6007]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:46 np0005624716.novalocal python3[6031]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:47 np0005624716.novalocal python3[6055]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:47 np0005624716.novalocal python3[6079]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:47 np0005624716.novalocal python3[6103]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:47 np0005624716.novalocal python3[6127]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:48 np0005624716.novalocal python3[6151]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:48 np0005624716.novalocal python3[6175]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:48 np0005624716.novalocal python3[6199]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:48 np0005624716.novalocal python3[6223]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICWBreHW95Wz2Toz5YwCGQwFcUG8oFYkienDh9tntmDc ralfieri@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:49 np0005624716.novalocal python3[6247]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:49 np0005624716.novalocal python3[6271]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:49 np0005624716.novalocal python3[6295]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:50 np0005624716.novalocal python3[6319]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:50 np0005624716.novalocal python3[6343]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:50 np0005624716.novalocal python3[6367]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:50 np0005624716.novalocal python3[6391]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:51 np0005624716.novalocal python3[6415]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:51 np0005624716.novalocal python3[6439]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:51 np0005624716.novalocal python3[6463]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:51 np0005624716.novalocal python3[6487]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:52 np0005624716.novalocal python3[6511]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:52 np0005624716.novalocal python3[6535]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:29:55 np0005624716.novalocal sudo[6559]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umpbjrqeohokaljhxoxlewiicanufjoh ; /usr/bin/python3'
Feb 19 18:29:55 np0005624716.novalocal sudo[6559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:29:55 np0005624716.novalocal python3[6561]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 19 18:29:55 np0005624716.novalocal systemd[1]: Starting Time & Date Service...
Feb 19 18:29:55 np0005624716.novalocal systemd[1]: Started Time & Date Service.
Feb 19 18:29:55 np0005624716.novalocal systemd-timedated[6563]: Changed time zone to 'UTC' (UTC).
Feb 19 18:29:55 np0005624716.novalocal sudo[6559]: pam_unix(sudo:session): session closed for user root
Feb 19 18:29:56 np0005624716.novalocal sudo[6590]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdxwuucmewelfaxcszneinyygljyfrzk ; /usr/bin/python3'
Feb 19 18:29:56 np0005624716.novalocal sudo[6590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:29:56 np0005624716.novalocal python3[6592]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:29:56 np0005624716.novalocal sudo[6590]: pam_unix(sudo:session): session closed for user root
Feb 19 18:29:56 np0005624716.novalocal python3[6668]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 19 18:29:57 np0005624716.novalocal python3[6739]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1771525796.4887106-202-165646065167960/source _original_basename=tmp_3rlf9di follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:29:57 np0005624716.novalocal python3[6839]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 19 18:29:58 np0005624716.novalocal python3[6910]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1771525797.4733024-242-233656853674956/source _original_basename=tmp98d6xapo follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:29:58 np0005624716.novalocal sudo[7010]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuxwmmkxrumxmsdibtciwvbpevnqnwwk ; /usr/bin/python3'
Feb 19 18:29:58 np0005624716.novalocal sudo[7010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:29:58 np0005624716.novalocal python3[7012]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 19 18:29:58 np0005624716.novalocal sudo[7010]: pam_unix(sudo:session): session closed for user root
Feb 19 18:29:59 np0005624716.novalocal sudo[7083]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avpgvwgafqpcglvwievpkmrgslwldtqs ; /usr/bin/python3'
Feb 19 18:29:59 np0005624716.novalocal sudo[7083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:29:59 np0005624716.novalocal python3[7085]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1771525798.6928988-306-73285700730175/source _original_basename=tmp6k2y5jdh follow=False checksum=873438299bb17ff1128a56bbeb324b7beaf57647 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:29:59 np0005624716.novalocal sudo[7083]: pam_unix(sudo:session): session closed for user root
Feb 19 18:29:59 np0005624716.novalocal python3[7133]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 18:30:00 np0005624716.novalocal python3[7159]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 18:30:00 np0005624716.novalocal sudo[7237]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jibnechcceblubvxqrtzusrzoyxrceld ; /usr/bin/python3'
Feb 19 18:30:00 np0005624716.novalocal sudo[7237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:30:00 np0005624716.novalocal python3[7239]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 19 18:30:00 np0005624716.novalocal sudo[7237]: pam_unix(sudo:session): session closed for user root
Feb 19 18:30:00 np0005624716.novalocal sudo[7310]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txeyfdcllbumkmkxzldecjjgtsbzbxur ; /usr/bin/python3'
Feb 19 18:30:00 np0005624716.novalocal sudo[7310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:30:00 np0005624716.novalocal python3[7312]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1771525800.372458-362-250513639016876/source _original_basename=tmpjuz8kg5v follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:30:01 np0005624716.novalocal sudo[7310]: pam_unix(sudo:session): session closed for user root
Feb 19 18:30:01 np0005624716.novalocal sudo[7361]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljehaqzqikewpkyatwgfiqlsjaezjlsa ; /usr/bin/python3'
Feb 19 18:30:01 np0005624716.novalocal sudo[7361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:30:01 np0005624716.novalocal python3[7363]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-b0ad-f0f7-00000000001e-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 18:30:01 np0005624716.novalocal sudo[7361]: pam_unix(sudo:session): session closed for user root
Feb 19 18:30:02 np0005624716.novalocal python3[7391]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env _uses_shell=True zuul_log_id=fa163efc-24cc-b0ad-f0f7-00000000001f-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Feb 19 18:30:03 np0005624716.novalocal python3[7420]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:30:21 np0005624716.novalocal sshd-session[7421]: Connection closed by 165.22.221.82 port 52644
Feb 19 18:30:22 np0005624716.novalocal sudo[7445]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvzqzqvdoczsriuqqjdypevauejznsab ; /usr/bin/python3'
Feb 19 18:30:22 np0005624716.novalocal sudo[7445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:30:22 np0005624716.novalocal python3[7447]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:30:22 np0005624716.novalocal sudo[7445]: pam_unix(sudo:session): session closed for user root
Feb 19 18:30:25 np0005624716.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 19 18:30:57 np0005624716.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Feb 19 18:30:57 np0005624716.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Feb 19 18:30:57 np0005624716.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Feb 19 18:30:57 np0005624716.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Feb 19 18:30:57 np0005624716.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Feb 19 18:30:57 np0005624716.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Feb 19 18:30:57 np0005624716.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Feb 19 18:30:57 np0005624716.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Feb 19 18:30:57 np0005624716.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Feb 19 18:30:57 np0005624716.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Feb 19 18:30:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525857.6805] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 19 18:30:57 np0005624716.novalocal systemd-udevd[7450]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 18:30:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525857.6972] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 19 18:30:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525857.7004] settings: (eth1): created default wired connection 'Wired connection 1'
Feb 19 18:30:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525857.7010] device (eth1): carrier: link connected
Feb 19 18:30:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525857.7013] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Feb 19 18:30:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525857.7020] policy: auto-activating connection 'Wired connection 1' (1cf3a24b-f7c9-3493-9032-4c304d6d9dc5)
Feb 19 18:30:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525857.7025] device (eth1): Activation: starting connection 'Wired connection 1' (1cf3a24b-f7c9-3493-9032-4c304d6d9dc5)
Feb 19 18:30:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525857.7026] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 19 18:30:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525857.7031] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 19 18:30:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525857.7036] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 19 18:30:57 np0005624716.novalocal NetworkManager[873]: <info>  [1771525857.7041] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 19 18:30:58 np0005624716.novalocal python3[7477]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-498a-2b1e-000000000112-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 18:31:05 np0005624716.novalocal sudo[7555]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfsqtbnpdyslklkvudumncppkehsckoe ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 19 18:31:05 np0005624716.novalocal sudo[7555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:31:05 np0005624716.novalocal python3[7557]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 19 18:31:05 np0005624716.novalocal sudo[7555]: pam_unix(sudo:session): session closed for user root
Feb 19 18:31:05 np0005624716.novalocal sudo[7628]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvrnjmxyjjusscxkyclivvirazveyemg ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 19 18:31:05 np0005624716.novalocal sudo[7628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:31:05 np0005624716.novalocal python3[7630]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771525865.1306376-103-276797386494605/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=8df6033f7dbe81e2b4c27334b6dfc5f0ac83edce backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:31:05 np0005624716.novalocal sudo[7628]: pam_unix(sudo:session): session closed for user root
Feb 19 18:31:06 np0005624716.novalocal sudo[7678]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtxrdaravegitvwlhluxmtgetdansfum ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 19 18:31:06 np0005624716.novalocal sudo[7678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:31:06 np0005624716.novalocal python3[7680]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 19 18:31:06 np0005624716.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 19 18:31:06 np0005624716.novalocal systemd[1]: Stopped Network Manager Wait Online.
Feb 19 18:31:06 np0005624716.novalocal systemd[1]: Stopping Network Manager Wait Online...
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[873]: <info>  [1771525866.7573] caught SIGTERM, shutting down normally.
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[873]: <info>  [1771525866.7582] dhcp4 (eth0): canceled DHCP transaction
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[873]: <info>  [1771525866.7583] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[873]: <info>  [1771525866.7583] dhcp4 (eth0): state changed no lease
Feb 19 18:31:06 np0005624716.novalocal systemd[1]: Stopping Network Manager...
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[873]: <info>  [1771525866.7587] manager: NetworkManager state is now CONNECTING
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[873]: <info>  [1771525866.7717] dhcp4 (eth1): canceled DHCP transaction
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[873]: <info>  [1771525866.7718] dhcp4 (eth1): state changed no lease
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[873]: <info>  [1771525866.7773] exiting (success)
Feb 19 18:31:06 np0005624716.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 19 18:31:06 np0005624716.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 19 18:31:06 np0005624716.novalocal systemd[1]: Stopped Network Manager.
Feb 19 18:31:06 np0005624716.novalocal systemd[1]: Starting Network Manager...
Feb 19 18:31:06 np0005624716.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.8184] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:ff499d4a-6be1-401f-8b91-3dc243d525b1)
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.8186] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.8232] manager[0x561af4a65000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 19 18:31:06 np0005624716.novalocal systemd[1]: Starting Hostname Service...
Feb 19 18:31:06 np0005624716.novalocal systemd[1]: Started Hostname Service.
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9233] hostname: hostname: using hostnamed
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9234] hostname: static hostname changed from (none) to "np0005624716.novalocal"
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9238] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9244] manager[0x561af4a65000]: rfkill: Wi-Fi hardware radio set enabled
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9244] manager[0x561af4a65000]: rfkill: WWAN hardware radio set enabled
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9271] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9271] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9272] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9272] manager: Networking is enabled by state file
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9274] settings: Loaded settings plugin: keyfile (internal)
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9278] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9303] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9312] dhcp: init: Using DHCP client 'internal'
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9315] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9320] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9325] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9334] device (lo): Activation: starting connection 'lo' (4563c9fb-fd0a-4471-af90-84b9f0d73c08)
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9340] device (eth0): carrier: link connected
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9343] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9348] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9348] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9354] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9361] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9366] device (eth1): carrier: link connected
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9371] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9376] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (1cf3a24b-f7c9-3493-9032-4c304d6d9dc5) (indicated)
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9377] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9381] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9389] device (eth1): Activation: starting connection 'Wired connection 1' (1cf3a24b-f7c9-3493-9032-4c304d6d9dc5)
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9396] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 19 18:31:06 np0005624716.novalocal systemd[1]: Started Network Manager.
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9401] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9404] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9407] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9410] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9414] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9417] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9420] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9423] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9435] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9438] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9448] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9451] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9483] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9486] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9491] device (lo): Activation: successful, device activated.
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9500] dhcp4 (eth0): state changed new lease, address=38.102.83.219
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9507] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 19 18:31:06 np0005624716.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9593] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9621] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9623] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9627] manager: NetworkManager state is now CONNECTED_SITE
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9630] device (eth0): Activation: successful, device activated.
Feb 19 18:31:06 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525866.9635] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 19 18:31:06 np0005624716.novalocal sudo[7678]: pam_unix(sudo:session): session closed for user root
Feb 19 18:31:07 np0005624716.novalocal python3[7764]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-498a-2b1e-0000000000b2-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 18:31:17 np0005624716.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 19 18:31:36 np0005624716.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 19 18:31:52 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525912.3856] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 19 18:31:52 np0005624716.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 19 18:31:52 np0005624716.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 19 18:31:52 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525912.4156] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 19 18:31:52 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525912.4160] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 19 18:31:52 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525912.4168] device (eth1): Activation: successful, device activated.
Feb 19 18:31:52 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525912.4175] manager: startup complete
Feb 19 18:31:52 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525912.4176] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Feb 19 18:31:52 np0005624716.novalocal NetworkManager[7684]: <warn>  [1771525912.4184] device (eth1): Activation: failed for connection 'Wired connection 1'
Feb 19 18:31:52 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525912.4192] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Feb 19 18:31:52 np0005624716.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 19 18:31:52 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525912.4327] dhcp4 (eth1): canceled DHCP transaction
Feb 19 18:31:52 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525912.4329] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 19 18:31:52 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525912.4329] dhcp4 (eth1): state changed no lease
Feb 19 18:31:52 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525912.4348] policy: auto-activating connection 'ci-private-network' (4e789d3f-d1df-5b4d-a218-9665bd6d0d0e)
Feb 19 18:31:52 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525912.4354] device (eth1): Activation: starting connection 'ci-private-network' (4e789d3f-d1df-5b4d-a218-9665bd6d0d0e)
Feb 19 18:31:52 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525912.4356] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 19 18:31:52 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525912.4362] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 19 18:31:52 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525912.4372] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 19 18:31:52 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525912.4384] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 19 18:31:52 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525912.4432] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 19 18:31:52 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525912.4439] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 19 18:31:52 np0005624716.novalocal NetworkManager[7684]: <info>  [1771525912.4454] device (eth1): Activation: successful, device activated.
Feb 19 18:32:02 np0005624716.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 19 18:32:07 np0005624716.novalocal sshd-session[4813]: Received disconnect from 38.102.83.114 port 58620:11: disconnected by user
Feb 19 18:32:07 np0005624716.novalocal sshd-session[4813]: Disconnected from user zuul 38.102.83.114 port 58620
Feb 19 18:32:07 np0005624716.novalocal sshd-session[4800]: pam_unix(sshd:session): session closed for user zuul
Feb 19 18:32:07 np0005624716.novalocal systemd-logind[822]: Session 1 logged out. Waiting for processes to exit.
Feb 19 18:32:13 np0005624716.novalocal systemd[4804]: Starting Mark boot as successful...
Feb 19 18:32:13 np0005624716.novalocal systemd[4804]: Finished Mark boot as successful.
Feb 19 18:32:28 np0005624716.novalocal sshd-session[7793]: Accepted publickey for zuul from 38.102.83.114 port 42068 ssh2: RSA SHA256:IX/qU16dJ1hklhs5lf1q2BDqrTapkjt2lWmuMJZ/HJ0
Feb 19 18:32:28 np0005624716.novalocal systemd-logind[822]: New session 3 of user zuul.
Feb 19 18:32:28 np0005624716.novalocal systemd[1]: Started Session 3 of User zuul.
Feb 19 18:32:28 np0005624716.novalocal sshd-session[7793]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 18:32:28 np0005624716.novalocal sudo[7872]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdfrkezetltligbinxmgwefxwpgmcjux ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 19 18:32:28 np0005624716.novalocal sudo[7872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:32:28 np0005624716.novalocal python3[7874]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 19 18:32:28 np0005624716.novalocal sudo[7872]: pam_unix(sudo:session): session closed for user root
Feb 19 18:32:29 np0005624716.novalocal sudo[7945]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slqrwupbmjeccxfwovwebroljmraeavs ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 19 18:32:29 np0005624716.novalocal sudo[7945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:32:29 np0005624716.novalocal python3[7947]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771525948.6354022-309-47590500162618/source _original_basename=tmpnrr83tsl follow=False checksum=faefe9023cbe92de53c3baf261142a475f05e087 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:32:29 np0005624716.novalocal sudo[7945]: pam_unix(sudo:session): session closed for user root
Feb 19 18:32:32 np0005624716.novalocal sshd-session[7796]: Connection closed by 38.102.83.114 port 42068
Feb 19 18:32:32 np0005624716.novalocal sshd-session[7793]: pam_unix(sshd:session): session closed for user zuul
Feb 19 18:32:32 np0005624716.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Feb 19 18:32:32 np0005624716.novalocal systemd-logind[822]: Session 3 logged out. Waiting for processes to exit.
Feb 19 18:32:32 np0005624716.novalocal systemd-logind[822]: Removed session 3.
Feb 19 18:33:29 np0005624716.novalocal sshd-session[7972]: Connection closed by authenticating user root 165.22.221.82 port 41420 [preauth]
Feb 19 18:34:02 np0005624716.novalocal sshd-session[7974]: Invalid user sftptest from 43.166.137.151 port 41326
Feb 19 18:34:02 np0005624716.novalocal sshd-session[7974]: Received disconnect from 43.166.137.151 port 41326:11: Bye Bye [preauth]
Feb 19 18:34:02 np0005624716.novalocal sshd-session[7974]: Disconnected from invalid user sftptest 43.166.137.151 port 41326 [preauth]
Feb 19 18:34:13 np0005624716.novalocal sshd-session[7976]: Invalid user gitea from 138.255.157.62 port 61464
Feb 19 18:34:13 np0005624716.novalocal sshd-session[7976]: Received disconnect from 138.255.157.62 port 61464:11: Bye Bye [preauth]
Feb 19 18:34:13 np0005624716.novalocal sshd-session[7976]: Disconnected from invalid user gitea 138.255.157.62 port 61464 [preauth]
Feb 19 18:34:25 np0005624716.novalocal sshd-session[7978]: Received disconnect from 45.148.10.152 port 45474:11:  [preauth]
Feb 19 18:34:25 np0005624716.novalocal sshd-session[7978]: Disconnected from authenticating user root 45.148.10.152 port 45474 [preauth]
Feb 19 18:34:43 np0005624716.novalocal sshd-session[7981]: Connection closed by authenticating user root 165.22.221.82 port 60328 [preauth]
Feb 19 18:35:13 np0005624716.novalocal systemd[4804]: Created slice User Background Tasks Slice.
Feb 19 18:35:13 np0005624716.novalocal systemd[4804]: Starting Cleanup of User's Temporary Files and Directories...
Feb 19 18:35:13 np0005624716.novalocal systemd[4804]: Finished Cleanup of User's Temporary Files and Directories.
Feb 19 18:35:48 np0005624716.novalocal sshd-session[7985]: Connection closed by authenticating user root 165.22.221.82 port 39104 [preauth]
Feb 19 18:36:47 np0005624716.novalocal sshd-session[7987]: Connection closed by authenticating user root 165.22.221.82 port 40310 [preauth]
Feb 19 18:37:49 np0005624716.novalocal sshd-session[7991]: Connection closed by authenticating user root 165.22.221.82 port 48032 [preauth]
Feb 19 18:38:01 np0005624716.novalocal sshd-session[7993]: Invalid user claude from 27.50.25.190 port 35508
Feb 19 18:38:02 np0005624716.novalocal sshd-session[7993]: Received disconnect from 27.50.25.190 port 35508:11: Bye Bye [preauth]
Feb 19 18:38:02 np0005624716.novalocal sshd-session[7993]: Disconnected from invalid user claude 27.50.25.190 port 35508 [preauth]
Feb 19 18:38:15 np0005624716.novalocal sshd-session[7996]: Accepted publickey for zuul from 38.102.83.114 port 51200 ssh2: RSA SHA256:IX/qU16dJ1hklhs5lf1q2BDqrTapkjt2lWmuMJZ/HJ0
Feb 19 18:38:15 np0005624716.novalocal systemd-logind[822]: New session 4 of user zuul.
Feb 19 18:38:15 np0005624716.novalocal systemd[1]: Started Session 4 of User zuul.
Feb 19 18:38:15 np0005624716.novalocal sshd-session[7996]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 18:38:16 np0005624716.novalocal sudo[8023]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvmkdaqatcgcibebxxhvfzinabtyukwb ; /usr/bin/python3'
Feb 19 18:38:16 np0005624716.novalocal sudo[8023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:38:16 np0005624716.novalocal python3[8025]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-7d0f-e24a-000000002168-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 18:38:16 np0005624716.novalocal sudo[8023]: pam_unix(sudo:session): session closed for user root
Feb 19 18:38:16 np0005624716.novalocal sudo[8051]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eijlzzzuynhofqkcrgsohvbyfxnniyws ; /usr/bin/python3'
Feb 19 18:38:16 np0005624716.novalocal sudo[8051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:38:16 np0005624716.novalocal python3[8053]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:38:16 np0005624716.novalocal sudo[8051]: pam_unix(sudo:session): session closed for user root
Feb 19 18:38:16 np0005624716.novalocal sudo[8077]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhqjkcuroyumprgueoqqpuagseqnxnhk ; /usr/bin/python3'
Feb 19 18:38:16 np0005624716.novalocal sudo[8077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:38:16 np0005624716.novalocal python3[8080]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:38:16 np0005624716.novalocal sudo[8077]: pam_unix(sudo:session): session closed for user root
Feb 19 18:38:16 np0005624716.novalocal sudo[8104]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mglsygwenkqzikhmupyrqubcsgatmhxw ; /usr/bin/python3'
Feb 19 18:38:16 np0005624716.novalocal sudo[8104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:38:17 np0005624716.novalocal python3[8106]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:38:17 np0005624716.novalocal sudo[8104]: pam_unix(sudo:session): session closed for user root
Feb 19 18:38:17 np0005624716.novalocal sudo[8130]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htjbtdwsywhajhqimwvhqshwgpsugdsu ; /usr/bin/python3'
Feb 19 18:38:17 np0005624716.novalocal sudo[8130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:38:17 np0005624716.novalocal python3[8132]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:38:17 np0005624716.novalocal sudo[8130]: pam_unix(sudo:session): session closed for user root
Feb 19 18:38:17 np0005624716.novalocal sudo[8156]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvrdguoflykylvzjehskknkcpjrtabud ; /usr/bin/python3'
Feb 19 18:38:17 np0005624716.novalocal sudo[8156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:38:17 np0005624716.novalocal python3[8158]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:38:17 np0005624716.novalocal sudo[8156]: pam_unix(sudo:session): session closed for user root
Feb 19 18:38:18 np0005624716.novalocal sudo[8234]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtcvhgjzvrpqyztwausfuwlylzatagxt ; /usr/bin/python3'
Feb 19 18:38:18 np0005624716.novalocal sudo[8234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:38:18 np0005624716.novalocal python3[8236]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 19 18:38:18 np0005624716.novalocal sudo[8234]: pam_unix(sudo:session): session closed for user root
Feb 19 18:38:18 np0005624716.novalocal sudo[8307]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swzyfzkesvxobwzclfrupjjuvcveehfs ; /usr/bin/python3'
Feb 19 18:38:18 np0005624716.novalocal sudo[8307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:38:18 np0005624716.novalocal python3[8309]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771526298.052335-513-275409644291631/source _original_basename=tmpybedy2ol follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:38:18 np0005624716.novalocal sudo[8307]: pam_unix(sudo:session): session closed for user root
Feb 19 18:38:19 np0005624716.novalocal sudo[8357]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-diyfkzizwvixlypclndpnauecvsbnfqf ; /usr/bin/python3'
Feb 19 18:38:19 np0005624716.novalocal sudo[8357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:38:19 np0005624716.novalocal python3[8359]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 19 18:38:19 np0005624716.novalocal systemd[1]: Reloading.
Feb 19 18:38:19 np0005624716.novalocal systemd-rc-local-generator[8380]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 18:38:19 np0005624716.novalocal sudo[8357]: pam_unix(sudo:session): session closed for user root
Feb 19 18:38:20 np0005624716.novalocal sudo[8421]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cykuvyxbxtbqvonxauyutjcyacbmwxxo ; /usr/bin/python3'
Feb 19 18:38:20 np0005624716.novalocal sudo[8421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:38:21 np0005624716.novalocal python3[8423]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Feb 19 18:38:21 np0005624716.novalocal sudo[8421]: pam_unix(sudo:session): session closed for user root
Feb 19 18:38:21 np0005624716.novalocal sudo[8447]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzlpbtfltfptzttplkomifexmdfufsne ; /usr/bin/python3'
Feb 19 18:38:21 np0005624716.novalocal sudo[8447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:38:21 np0005624716.novalocal python3[8449]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 18:38:21 np0005624716.novalocal sudo[8447]: pam_unix(sudo:session): session closed for user root
Feb 19 18:38:21 np0005624716.novalocal sudo[8475]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epdaaruepkcltwvqszjxyycdckeussky ; /usr/bin/python3'
Feb 19 18:38:21 np0005624716.novalocal sudo[8475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:38:21 np0005624716.novalocal python3[8477]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 18:38:21 np0005624716.novalocal sudo[8475]: pam_unix(sudo:session): session closed for user root
Feb 19 18:38:21 np0005624716.novalocal sudo[8503]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkyjmtdmjhucwhmbihocxgnmtyysveso ; /usr/bin/python3'
Feb 19 18:38:21 np0005624716.novalocal sudo[8503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:38:21 np0005624716.novalocal python3[8505]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 18:38:21 np0005624716.novalocal sudo[8503]: pam_unix(sudo:session): session closed for user root
Feb 19 18:38:22 np0005624716.novalocal sudo[8531]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytowzxbxuzexbolaoafbrugzdsuplmtk ; /usr/bin/python3'
Feb 19 18:38:22 np0005624716.novalocal sudo[8531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:38:22 np0005624716.novalocal python3[8533]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 18:38:22 np0005624716.novalocal sudo[8531]: pam_unix(sudo:session): session closed for user root
Feb 19 18:38:22 np0005624716.novalocal python3[8560]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max; _uses_shell=True zuul_log_id=fa163efc-24cc-7d0f-e24a-00000000216f-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 18:38:23 np0005624716.novalocal python3[8590]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 19 18:38:25 np0005624716.novalocal sshd-session[7999]: Connection closed by 38.102.83.114 port 51200
Feb 19 18:38:25 np0005624716.novalocal sshd-session[7996]: pam_unix(sshd:session): session closed for user zuul
Feb 19 18:38:25 np0005624716.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Feb 19 18:38:25 np0005624716.novalocal systemd[1]: session-4.scope: Consumed 3.939s CPU time.
Feb 19 18:38:25 np0005624716.novalocal systemd-logind[822]: Session 4 logged out. Waiting for processes to exit.
Feb 19 18:38:25 np0005624716.novalocal systemd-logind[822]: Removed session 4.
Feb 19 18:38:27 np0005624716.novalocal sshd-session[8594]: Accepted publickey for zuul from 38.102.83.114 port 45362 ssh2: RSA SHA256:IX/qU16dJ1hklhs5lf1q2BDqrTapkjt2lWmuMJZ/HJ0
Feb 19 18:38:27 np0005624716.novalocal systemd-logind[822]: New session 5 of user zuul.
Feb 19 18:38:27 np0005624716.novalocal systemd[1]: Started Session 5 of User zuul.
Feb 19 18:38:27 np0005624716.novalocal sshd-session[8594]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 18:38:27 np0005624716.novalocal sudo[8621]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iddnhmhrjcjromqkoseuuebqfewqsxxr ; /usr/bin/python3'
Feb 19 18:38:27 np0005624716.novalocal sudo[8621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:38:27 np0005624716.novalocal python3[8623]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 19 18:38:36 np0005624716.novalocal setsebool[8659]: The virt_use_nfs policy boolean was changed to 1 by root
Feb 19 18:38:36 np0005624716.novalocal setsebool[8659]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Feb 19 18:38:47 np0005624716.novalocal kernel: SELinux:  Converting 385 SID table entries...
Feb 19 18:38:47 np0005624716.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 19 18:38:47 np0005624716.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 19 18:38:47 np0005624716.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 19 18:38:47 np0005624716.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 19 18:38:47 np0005624716.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 19 18:38:47 np0005624716.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 19 18:38:47 np0005624716.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 19 18:38:52 np0005624716.novalocal sshd-session[8686]: Connection closed by authenticating user root 165.22.221.82 port 60506 [preauth]
Feb 19 18:38:57 np0005624716.novalocal kernel: SELinux:  Converting 388 SID table entries...
Feb 19 18:38:57 np0005624716.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 19 18:38:57 np0005624716.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 19 18:38:57 np0005624716.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 19 18:38:57 np0005624716.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 19 18:38:57 np0005624716.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 19 18:38:57 np0005624716.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 19 18:38:57 np0005624716.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 19 18:39:15 np0005624716.novalocal dbus-broker-launch[804]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 19 18:39:15 np0005624716.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 19 18:39:15 np0005624716.novalocal systemd[1]: Starting man-db-cache-update.service...
Feb 19 18:39:15 np0005624716.novalocal systemd[1]: Reloading.
Feb 19 18:39:15 np0005624716.novalocal systemd-rc-local-generator[9449]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 18:39:15 np0005624716.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Feb 19 18:39:16 np0005624716.novalocal sudo[8621]: pam_unix(sudo:session): session closed for user root
Feb 19 18:39:24 np0005624716.novalocal python3[16662]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot" _uses_shell=True zuul_log_id=fa163efc-24cc-0ce5-688c-00000000000b-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 18:39:25 np0005624716.novalocal kernel: evm: overlay not supported
Feb 19 18:39:25 np0005624716.novalocal systemd[4804]: Starting D-Bus User Message Bus...
Feb 19 18:39:25 np0005624716.novalocal dbus-broker-launch[17449]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Feb 19 18:39:25 np0005624716.novalocal dbus-broker-launch[17449]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Feb 19 18:39:25 np0005624716.novalocal systemd[4804]: Started D-Bus User Message Bus.
Feb 19 18:39:25 np0005624716.novalocal dbus-broker-lau[17449]: Ready
Feb 19 18:39:25 np0005624716.novalocal systemd[4804]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 19 18:39:25 np0005624716.novalocal systemd[4804]: Created slice Slice /user.
Feb 19 18:39:25 np0005624716.novalocal systemd[4804]: podman-17366.scope: unit configures an IP firewall, but not running as root.
Feb 19 18:39:25 np0005624716.novalocal systemd[4804]: (This warning is only shown for the first unit using IP firewalling.)
Feb 19 18:39:25 np0005624716.novalocal systemd[4804]: Started podman-17366.scope.
Feb 19 18:39:26 np0005624716.novalocal systemd[4804]: Started podman-pause-a45ca69d.scope.
Feb 19 18:39:27 np0005624716.novalocal sudo[18371]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxznzlwlrkjbxsolyyuquiqsupweukiu ; /usr/bin/python3'
Feb 19 18:39:27 np0005624716.novalocal sudo[18371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:39:27 np0005624716.novalocal python3[18387]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.75:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.75:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:39:27 np0005624716.novalocal python3[18387]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Feb 19 18:39:27 np0005624716.novalocal sudo[18371]: pam_unix(sudo:session): session closed for user root
Feb 19 18:39:28 np0005624716.novalocal sshd-session[8597]: Connection closed by 38.102.83.114 port 45362
Feb 19 18:39:28 np0005624716.novalocal sshd-session[8594]: pam_unix(sshd:session): session closed for user zuul
Feb 19 18:39:28 np0005624716.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Feb 19 18:39:28 np0005624716.novalocal systemd[1]: session-5.scope: Consumed 41.462s CPU time.
Feb 19 18:39:28 np0005624716.novalocal systemd-logind[822]: Session 5 logged out. Waiting for processes to exit.
Feb 19 18:39:28 np0005624716.novalocal systemd-logind[822]: Removed session 5.
Feb 19 18:39:50 np0005624716.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 19 18:39:50 np0005624716.novalocal systemd[1]: Finished man-db-cache-update.service.
Feb 19 18:39:50 np0005624716.novalocal systemd[1]: man-db-cache-update.service: Consumed 37.667s CPU time.
Feb 19 18:39:50 np0005624716.novalocal systemd[1]: run-r2300340ee924443491ce53719a283972.service: Deactivated successfully.
Feb 19 18:39:51 np0005624716.novalocal sshd-session[30173]: Connection closed by 38.102.83.2 port 34926 [preauth]
Feb 19 18:39:51 np0005624716.novalocal sshd-session[30174]: Connection closed by 38.102.83.2 port 34934 [preauth]
Feb 19 18:39:51 np0005624716.novalocal sshd-session[30176]: Unable to negotiate with 38.102.83.2 port 34946: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Feb 19 18:39:51 np0005624716.novalocal sshd-session[30172]: Unable to negotiate with 38.102.83.2 port 34948: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Feb 19 18:39:51 np0005624716.novalocal sshd-session[30175]: Unable to negotiate with 38.102.83.2 port 34950: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Feb 19 18:39:55 np0005624716.novalocal sshd-session[30183]: Accepted publickey for zuul from 38.102.83.114 port 48480 ssh2: RSA SHA256:IX/qU16dJ1hklhs5lf1q2BDqrTapkjt2lWmuMJZ/HJ0
Feb 19 18:39:55 np0005624716.novalocal systemd-logind[822]: New session 6 of user zuul.
Feb 19 18:39:55 np0005624716.novalocal systemd[1]: Started Session 6 of User zuul.
Feb 19 18:39:55 np0005624716.novalocal sshd-session[30183]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 18:39:56 np0005624716.novalocal python3[30211]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ3OMShnOB50Wh8R42rZVSVKNfcu1jj6VI9QC3fvouTNGdq/kjKk30r3m8XtCxlED/GgH/hB2oeEypX795y5w6A= zuul@np0005624715.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:39:56 np0005624716.novalocal sudo[30236]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwfdrcijimxtpxzxrbgamopjtfxmhqkw ; /usr/bin/python3'
Feb 19 18:39:56 np0005624716.novalocal sudo[30236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:39:56 np0005624716.novalocal irqbalance[817]: Cannot change IRQ 30 affinity: Operation not permitted
Feb 19 18:39:56 np0005624716.novalocal irqbalance[817]: IRQ 30 affinity is now unmanaged
Feb 19 18:39:56 np0005624716.novalocal python3[30238]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ3OMShnOB50Wh8R42rZVSVKNfcu1jj6VI9QC3fvouTNGdq/kjKk30r3m8XtCxlED/GgH/hB2oeEypX795y5w6A= zuul@np0005624715.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:39:56 np0005624716.novalocal sudo[30236]: pam_unix(sudo:session): session closed for user root
Feb 19 18:39:57 np0005624716.novalocal sudo[30262]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jactzldvxllfyquoucxlnjtrfeuyafws ; /usr/bin/python3'
Feb 19 18:39:57 np0005624716.novalocal sudo[30262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:39:57 np0005624716.novalocal python3[30264]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005624716.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 19 18:39:57 np0005624716.novalocal useradd[30266]: new group: name=cloud-admin, GID=1002
Feb 19 18:39:57 np0005624716.novalocal useradd[30266]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Feb 19 18:39:57 np0005624716.novalocal sudo[30262]: pam_unix(sudo:session): session closed for user root
Feb 19 18:39:57 np0005624716.novalocal sshd-session[30210]: Connection closed by authenticating user root 165.22.221.82 port 51764 [preauth]
Feb 19 18:39:57 np0005624716.novalocal sudo[30296]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unjletajhfmgxtmdtozirnxzdqovlvzi ; /usr/bin/python3'
Feb 19 18:39:57 np0005624716.novalocal sudo[30296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:39:57 np0005624716.novalocal python3[30298]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ3OMShnOB50Wh8R42rZVSVKNfcu1jj6VI9QC3fvouTNGdq/kjKk30r3m8XtCxlED/GgH/hB2oeEypX795y5w6A= zuul@np0005624715.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 19 18:39:57 np0005624716.novalocal sudo[30296]: pam_unix(sudo:session): session closed for user root
Feb 19 18:39:58 np0005624716.novalocal sudo[30374]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywfqkofikwqxudanizimeuvmzhvozaqm ; /usr/bin/python3'
Feb 19 18:39:58 np0005624716.novalocal sudo[30374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:39:58 np0005624716.novalocal python3[30376]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 19 18:39:58 np0005624716.novalocal sudo[30374]: pam_unix(sudo:session): session closed for user root
Feb 19 18:39:58 np0005624716.novalocal sudo[30447]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfbtfubkdldxzoidpvkauknudbfhvauq ; /usr/bin/python3'
Feb 19 18:39:58 np0005624716.novalocal sudo[30447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:39:58 np0005624716.novalocal python3[30449]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771526398.046159-151-206018912884552/source _original_basename=tmpjdqctbk3 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:39:58 np0005624716.novalocal sudo[30447]: pam_unix(sudo:session): session closed for user root
Feb 19 18:39:59 np0005624716.novalocal sudo[30497]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkbbkeswqvvazocfmswashonuphvjibc ; /usr/bin/python3'
Feb 19 18:39:59 np0005624716.novalocal sudo[30497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:40:00 np0005624716.novalocal python3[30499]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Feb 19 18:40:00 np0005624716.novalocal systemd[1]: Starting Hostname Service...
Feb 19 18:40:00 np0005624716.novalocal systemd[1]: Started Hostname Service.
Feb 19 18:40:00 np0005624716.novalocal systemd-hostnamed[30503]: Changed pretty hostname to 'compute-0'
Feb 19 18:40:00 compute-0 systemd-hostnamed[30503]: Hostname set to <compute-0> (static)
Feb 19 18:40:00 compute-0 NetworkManager[7684]: <info>  [1771526400.1878] hostname: static hostname changed from "np0005624716.novalocal" to "compute-0"
Feb 19 18:40:00 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 19 18:40:00 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 19 18:40:01 compute-0 sudo[30497]: pam_unix(sudo:session): session closed for user root
Feb 19 18:40:01 compute-0 sshd-session[30186]: Connection closed by 38.102.83.114 port 48480
Feb 19 18:40:01 compute-0 sshd-session[30183]: pam_unix(sshd:session): session closed for user zuul
Feb 19 18:40:01 compute-0 systemd-logind[822]: Session 6 logged out. Waiting for processes to exit.
Feb 19 18:40:01 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Feb 19 18:40:01 compute-0 systemd[1]: session-6.scope: Consumed 2.137s CPU time.
Feb 19 18:40:01 compute-0 systemd-logind[822]: Removed session 6.
Feb 19 18:40:10 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 19 18:40:30 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 19 18:40:37 compute-0 sshd-session[30519]: Invalid user n8n from 43.166.137.151 port 47912
Feb 19 18:40:37 compute-0 sshd-session[30519]: Received disconnect from 43.166.137.151 port 47912:11: Bye Bye [preauth]
Feb 19 18:40:37 compute-0 sshd-session[30519]: Disconnected from invalid user n8n 43.166.137.151 port 47912 [preauth]
Feb 19 18:41:02 compute-0 sshd-session[30521]: Connection closed by authenticating user root 165.22.221.82 port 60438 [preauth]
Feb 19 18:41:21 compute-0 sshd-session[30523]: Connection closed by 45.174.242.246 port 43720
Feb 19 18:41:21 compute-0 sshd-session[30524]: Invalid user a from 45.174.242.246 port 43732
Feb 19 18:41:21 compute-0 sshd-session[30524]: Connection closed by invalid user a 45.174.242.246 port 43732 [preauth]
Feb 19 18:41:33 compute-0 sshd-session[30526]: Invalid user teamspeak3 from 138.255.157.62 port 30847
Feb 19 18:41:34 compute-0 sshd-session[30526]: Received disconnect from 138.255.157.62 port 30847:11: Bye Bye [preauth]
Feb 19 18:41:34 compute-0 sshd-session[30526]: Disconnected from invalid user teamspeak3 138.255.157.62 port 30847 [preauth]
Feb 19 18:42:02 compute-0 sshd-session[30533]: Received disconnect from 91.224.92.78 port 32476:11:  [preauth]
Feb 19 18:42:02 compute-0 sshd-session[30533]: Disconnected from authenticating user root 91.224.92.78 port 32476 [preauth]
Feb 19 18:42:02 compute-0 sshd-session[30530]: Connection closed by authenticating user root 165.22.221.82 port 37764 [preauth]
Feb 19 18:42:45 compute-0 sshd-session[30535]: Invalid user gitea from 27.50.25.190 port 43974
Feb 19 18:42:45 compute-0 sshd-session[30535]: Received disconnect from 27.50.25.190 port 43974:11: Bye Bye [preauth]
Feb 19 18:42:45 compute-0 sshd-session[30535]: Disconnected from invalid user gitea 27.50.25.190 port 43974 [preauth]
Feb 19 18:43:03 compute-0 sshd-session[30537]: Connection closed by authenticating user root 165.22.221.82 port 52166 [preauth]
Feb 19 18:43:36 compute-0 sshd-session[30539]: Received disconnect from 43.166.137.151 port 51138:11: Bye Bye [preauth]
Feb 19 18:43:36 compute-0 sshd-session[30539]: Disconnected from authenticating user root 43.166.137.151 port 51138 [preauth]
Feb 19 18:44:04 compute-0 sshd-session[30541]: Connection closed by authenticating user root 165.22.221.82 port 42284 [preauth]
Feb 19 18:44:04 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Feb 19 18:44:04 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Feb 19 18:44:04 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Feb 19 18:44:04 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Feb 19 18:44:14 compute-0 sshd-session[30546]: Accepted publickey for zuul from 38.102.83.2 port 48432 ssh2: RSA SHA256:IX/qU16dJ1hklhs5lf1q2BDqrTapkjt2lWmuMJZ/HJ0
Feb 19 18:44:14 compute-0 systemd-logind[822]: New session 7 of user zuul.
Feb 19 18:44:14 compute-0 systemd[1]: Started Session 7 of User zuul.
Feb 19 18:44:14 compute-0 sshd-session[30546]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 18:44:14 compute-0 python3[30622]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 18:44:15 compute-0 sudo[30736]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jenwqhchiqwthpqivqrakolybdndgurm ; /usr/bin/python3'
Feb 19 18:44:15 compute-0 sudo[30736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:44:16 compute-0 python3[30738]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 19 18:44:16 compute-0 sudo[30736]: pam_unix(sudo:session): session closed for user root
Feb 19 18:44:16 compute-0 sudo[30809]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxlhvpvslpqcyiwybrwsiybdjrysmbpt ; /usr/bin/python3'
Feb 19 18:44:16 compute-0 sudo[30809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:44:16 compute-0 python3[30811]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771526655.8062446-34430-78648386340590/source mode=0755 _original_basename=delorean.repo follow=False checksum=f8af988c6049bbc5123c33e56608f7a6dd7cf5c7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:44:16 compute-0 sudo[30809]: pam_unix(sudo:session): session closed for user root
Feb 19 18:44:16 compute-0 sudo[30835]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcuanmmrtyrslcxwydjjrufmieemmisq ; /usr/bin/python3'
Feb 19 18:44:16 compute-0 sudo[30835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:44:16 compute-0 python3[30837]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-master-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 19 18:44:16 compute-0 sudo[30835]: pam_unix(sudo:session): session closed for user root
Feb 19 18:44:16 compute-0 sudo[30908]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbskddehpqxofomdzbsdpkusbkbfaqdg ; /usr/bin/python3'
Feb 19 18:44:16 compute-0 sudo[30908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:44:17 compute-0 python3[30910]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771526655.8062446-34430-78648386340590/source mode=0755 _original_basename=delorean-master-testing.repo follow=False checksum=2c5ad31b3cd5c5b96a9995d83e342833f9bd7020 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:44:17 compute-0 sudo[30908]: pam_unix(sudo:session): session closed for user root
Feb 19 18:44:17 compute-0 sudo[30934]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agjkpnovvjgsdopwiqcnetnwgijhglzw ; /usr/bin/python3'
Feb 19 18:44:17 compute-0 sudo[30934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:44:17 compute-0 python3[30936]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 19 18:44:17 compute-0 sudo[30934]: pam_unix(sudo:session): session closed for user root
Feb 19 18:44:17 compute-0 sudo[31007]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvteknrvlgvwayubwqciskwijxflxyau ; /usr/bin/python3'
Feb 19 18:44:17 compute-0 sudo[31007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:44:17 compute-0 python3[31009]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771526655.8062446-34430-78648386340590/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:44:17 compute-0 sudo[31007]: pam_unix(sudo:session): session closed for user root
Feb 19 18:44:17 compute-0 sudo[31033]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjhgcvmvscgmttesfnioqxybxayprpxf ; /usr/bin/python3'
Feb 19 18:44:17 compute-0 sudo[31033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:44:17 compute-0 python3[31035]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 19 18:44:17 compute-0 sudo[31033]: pam_unix(sudo:session): session closed for user root
Feb 19 18:44:17 compute-0 sudo[31106]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tryueyfmdnffcuyylrydmqpmqcdghwoi ; /usr/bin/python3'
Feb 19 18:44:17 compute-0 sudo[31106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:44:18 compute-0 python3[31108]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771526655.8062446-34430-78648386340590/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:44:18 compute-0 sudo[31106]: pam_unix(sudo:session): session closed for user root
Feb 19 18:44:18 compute-0 sudo[31132]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iozsntowimvvobsauqblcanlrssfmvrw ; /usr/bin/python3'
Feb 19 18:44:18 compute-0 sudo[31132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:44:18 compute-0 python3[31134]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 19 18:44:18 compute-0 sudo[31132]: pam_unix(sudo:session): session closed for user root
Feb 19 18:44:18 compute-0 sudo[31205]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsgrznhhecphyirwvtbprzmxyggeezon ; /usr/bin/python3'
Feb 19 18:44:18 compute-0 sudo[31205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:44:18 compute-0 python3[31207]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771526655.8062446-34430-78648386340590/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:44:18 compute-0 sudo[31205]: pam_unix(sudo:session): session closed for user root
Feb 19 18:44:18 compute-0 sudo[31231]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfxtswdcyclspgdpofgbqfmhxmnxpfka ; /usr/bin/python3'
Feb 19 18:44:18 compute-0 sudo[31231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:44:18 compute-0 python3[31233]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 19 18:44:18 compute-0 sudo[31231]: pam_unix(sudo:session): session closed for user root
Feb 19 18:44:19 compute-0 sudo[31304]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssggpfslqwehcwrnttrmnwqgfhrnymct ; /usr/bin/python3'
Feb 19 18:44:19 compute-0 sudo[31304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:44:19 compute-0 python3[31306]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771526655.8062446-34430-78648386340590/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:44:19 compute-0 sudo[31304]: pam_unix(sudo:session): session closed for user root
Feb 19 18:44:19 compute-0 sudo[31330]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocfzabypwpzmfnpojbmauuurefmpagck ; /usr/bin/python3'
Feb 19 18:44:19 compute-0 sudo[31330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:44:19 compute-0 python3[31332]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 19 18:44:19 compute-0 sudo[31330]: pam_unix(sudo:session): session closed for user root
Feb 19 18:44:19 compute-0 sudo[31403]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvnwvozsyusdukuprrffdmkirvpzdcrt ; /usr/bin/python3'
Feb 19 18:44:19 compute-0 sudo[31403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:44:19 compute-0 python3[31405]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771526655.8062446-34430-78648386340590/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=dd69519341b7eaa85a6b34131419cd29ef086450 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:44:19 compute-0 sudo[31403]: pam_unix(sudo:session): session closed for user root
Feb 19 18:44:19 compute-0 sudo[31429]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esjmvptbozwhlpatidqgmhhhiaiqmuvu ; /usr/bin/python3'
Feb 19 18:44:19 compute-0 sudo[31429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:44:20 compute-0 python3[31431]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/gating.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 19 18:44:20 compute-0 sudo[31429]: pam_unix(sudo:session): session closed for user root
Feb 19 18:44:20 compute-0 sudo[31502]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glrrznxqupqgfqoypuyammybczqsvtvr ; /usr/bin/python3'
Feb 19 18:44:20 compute-0 sudo[31502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:44:20 compute-0 python3[31504]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771526655.8062446-34430-78648386340590/source mode=0755 _original_basename=gating.repo follow=False checksum=56372a7d2fb249afabd6ce2fdb23be69e69d1efc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:44:20 compute-0 sudo[31502]: pam_unix(sudo:session): session closed for user root
Feb 19 18:44:25 compute-0 sshd-session[31529]: Unable to negotiate with 192.168.122.11 port 34388: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Feb 19 18:44:25 compute-0 sshd-session[31530]: Unable to negotiate with 192.168.122.11 port 34402: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Feb 19 18:44:25 compute-0 sshd-session[31532]: Connection closed by 192.168.122.11 port 34382 [preauth]
Feb 19 18:44:25 compute-0 sshd-session[31533]: Connection closed by 192.168.122.11 port 34374 [preauth]
Feb 19 18:44:25 compute-0 sshd-session[31531]: Unable to negotiate with 192.168.122.11 port 34414: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Feb 19 18:45:07 compute-0 sshd-session[31540]: Connection closed by authenticating user root 165.22.221.82 port 55676 [preauth]
Feb 19 18:45:13 compute-0 sshd-session[31542]: Invalid user shreyas from 138.255.157.62 port 10706
Feb 19 18:45:13 compute-0 sshd-session[31542]: Received disconnect from 138.255.157.62 port 10706:11: Bye Bye [preauth]
Feb 19 18:45:13 compute-0 sshd-session[31542]: Disconnected from invalid user shreyas 138.255.157.62 port 10706 [preauth]
Feb 19 18:45:18 compute-0 python3[31567]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 18:46:09 compute-0 sshd-session[31570]: Connection closed by authenticating user root 165.22.221.82 port 59880 [preauth]
Feb 19 18:46:42 compute-0 sshd-session[31572]: Invalid user claude from 43.166.137.151 port 56510
Feb 19 18:46:42 compute-0 sshd-session[31572]: Received disconnect from 43.166.137.151 port 56510:11: Bye Bye [preauth]
Feb 19 18:46:42 compute-0 sshd-session[31572]: Disconnected from invalid user claude 43.166.137.151 port 56510 [preauth]
Feb 19 18:46:46 compute-0 sshd-session[31574]: Invalid user sftptest from 27.50.25.190 port 46328
Feb 19 18:46:47 compute-0 sshd-session[31574]: Received disconnect from 27.50.25.190 port 46328:11: Bye Bye [preauth]
Feb 19 18:46:47 compute-0 sshd-session[31574]: Disconnected from invalid user sftptest 27.50.25.190 port 46328 [preauth]
Feb 19 18:47:09 compute-0 sshd-session[31576]: Connection closed by authenticating user root 165.22.221.82 port 60700 [preauth]
Feb 19 18:48:07 compute-0 sshd-session[31579]: Connection closed by authenticating user root 165.22.221.82 port 46092 [preauth]
Feb 19 18:48:47 compute-0 sshd-session[31581]: Received disconnect from 138.255.157.62 port 24378:11: Bye Bye [preauth]
Feb 19 18:48:47 compute-0 sshd-session[31581]: Disconnected from authenticating user root 138.255.157.62 port 24378 [preauth]
Feb 19 18:49:05 compute-0 sshd-session[31583]: Connection closed by authenticating user root 165.22.221.82 port 43114 [preauth]
Feb 19 18:49:16 compute-0 sshd-session[31585]: Received disconnect from 45.148.10.151 port 49348:11:  [preauth]
Feb 19 18:49:16 compute-0 sshd-session[31585]: Disconnected from authenticating user root 45.148.10.151 port 49348 [preauth]
Feb 19 18:49:27 compute-0 sshd-session[31587]: Invalid user gitea from 43.166.137.151 port 49566
Feb 19 18:49:27 compute-0 sshd-session[31587]: Received disconnect from 43.166.137.151 port 49566:11: Bye Bye [preauth]
Feb 19 18:49:27 compute-0 sshd-session[31587]: Disconnected from invalid user gitea 43.166.137.151 port 49566 [preauth]
Feb 19 18:50:05 compute-0 sshd-session[31589]: Connection closed by authenticating user root 165.22.221.82 port 43748 [preauth]
Feb 19 18:50:18 compute-0 sshd-session[30549]: Received disconnect from 38.102.83.2 port 48432:11: disconnected by user
Feb 19 18:50:18 compute-0 sshd-session[30549]: Disconnected from user zuul 38.102.83.2 port 48432
Feb 19 18:50:18 compute-0 sshd-session[30546]: pam_unix(sshd:session): session closed for user zuul
Feb 19 18:50:18 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Feb 19 18:50:18 compute-0 systemd[1]: session-7.scope: Consumed 4.729s CPU time.
Feb 19 18:50:18 compute-0 systemd-logind[822]: Session 7 logged out. Waiting for processes to exit.
Feb 19 18:50:18 compute-0 systemd-logind[822]: Removed session 7.
Feb 19 18:50:34 compute-0 sshd-session[31591]: Invalid user mailuser from 27.50.25.190 port 37986
Feb 19 18:50:34 compute-0 sshd-session[31591]: Received disconnect from 27.50.25.190 port 37986:11: Bye Bye [preauth]
Feb 19 18:50:34 compute-0 sshd-session[31591]: Disconnected from invalid user mailuser 27.50.25.190 port 37986 [preauth]
Feb 19 18:51:10 compute-0 sshd-session[31594]: Connection closed by authenticating user root 165.22.221.82 port 36212 [preauth]
Feb 19 18:52:13 compute-0 sshd-session[31597]: Connection closed by authenticating user root 165.22.221.82 port 38810 [preauth]
Feb 19 18:52:15 compute-0 sshd-session[31599]: Received disconnect from 138.255.157.62 port 39706:11: Bye Bye [preauth]
Feb 19 18:52:15 compute-0 sshd-session[31599]: Disconnected from authenticating user root 138.255.157.62 port 39706 [preauth]
Feb 19 18:52:20 compute-0 sshd-session[31601]: Invalid user teamspeak3 from 43.166.137.151 port 60780
Feb 19 18:52:20 compute-0 sshd-session[31601]: Received disconnect from 43.166.137.151 port 60780:11: Bye Bye [preauth]
Feb 19 18:52:20 compute-0 sshd-session[31601]: Disconnected from invalid user teamspeak3 43.166.137.151 port 60780 [preauth]
Feb 19 18:52:35 compute-0 sshd-session[31603]: Received disconnect from 45.148.10.152 port 17288:11:  [preauth]
Feb 19 18:52:35 compute-0 sshd-session[31603]: Disconnected from authenticating user root 45.148.10.152 port 17288 [preauth]
Feb 19 18:53:17 compute-0 sshd-session[31605]: Connection closed by authenticating user root 165.22.221.82 port 55950 [preauth]
Feb 19 18:54:17 compute-0 sshd-session[31607]: Invalid user admin from 165.22.221.82 port 39584
Feb 19 18:54:17 compute-0 sshd-session[31607]: Connection closed by invalid user admin 165.22.221.82 port 39584 [preauth]
Feb 19 18:54:33 compute-0 sshd-session[31609]: Received disconnect from 27.50.25.190 port 36944:11: Bye Bye [preauth]
Feb 19 18:54:33 compute-0 sshd-session[31609]: Disconnected from authenticating user root 27.50.25.190 port 36944 [preauth]
Feb 19 18:55:13 compute-0 sshd-session[31612]: Invalid user oracle from 43.166.137.151 port 41602
Feb 19 18:55:13 compute-0 sshd-session[31612]: Received disconnect from 43.166.137.151 port 41602:11: Bye Bye [preauth]
Feb 19 18:55:13 compute-0 sshd-session[31612]: Disconnected from invalid user oracle 43.166.137.151 port 41602 [preauth]
Feb 19 18:55:16 compute-0 sshd-session[31614]: Invalid user admin from 165.22.221.82 port 42406
Feb 19 18:55:16 compute-0 sshd-session[31614]: Connection closed by invalid user admin 165.22.221.82 port 42406 [preauth]
Feb 19 18:55:48 compute-0 sshd-session[31616]: Received disconnect from 138.255.157.62 port 62471:11: Bye Bye [preauth]
Feb 19 18:55:48 compute-0 sshd-session[31616]: Disconnected from authenticating user root 138.255.157.62 port 62471 [preauth]
Feb 19 18:56:16 compute-0 sshd-session[31618]: Invalid user admin from 165.22.221.82 port 33762
Feb 19 18:56:16 compute-0 sshd-session[31618]: Connection closed by invalid user admin 165.22.221.82 port 33762 [preauth]
Feb 19 18:56:45 compute-0 sshd-session[31621]: Received disconnect from 91.224.92.78 port 13562:11:  [preauth]
Feb 19 18:56:45 compute-0 sshd-session[31621]: Disconnected from authenticating user root 91.224.92.78 port 13562 [preauth]
Feb 19 18:56:57 compute-0 sshd-session[31623]: Accepted publickey for zuul from 192.168.122.30 port 34274 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 18:56:57 compute-0 systemd-logind[822]: New session 8 of user zuul.
Feb 19 18:56:57 compute-0 systemd[1]: Started Session 8 of User zuul.
Feb 19 18:56:57 compute-0 sshd-session[31623]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 18:56:58 compute-0 python3.9[31776]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 18:56:59 compute-0 sudo[31955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjtwlcntfvuhzxnvbijotxmzoiedfiux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527419.4178119-39-46713193838420/AnsiballZ_command.py'
Feb 19 18:56:59 compute-0 sudo[31955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:56:59 compute-0 python3.9[31958]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 18:57:06 compute-0 sudo[31955]: pam_unix(sudo:session): session closed for user root
Feb 19 18:57:06 compute-0 sshd-session[31626]: Connection closed by 192.168.122.30 port 34274
Feb 19 18:57:06 compute-0 sshd-session[31623]: pam_unix(sshd:session): session closed for user zuul
Feb 19 18:57:06 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Feb 19 18:57:06 compute-0 systemd[1]: session-8.scope: Consumed 7.018s CPU time.
Feb 19 18:57:06 compute-0 systemd-logind[822]: Session 8 logged out. Waiting for processes to exit.
Feb 19 18:57:06 compute-0 systemd-logind[822]: Removed session 8.
Feb 19 18:57:12 compute-0 sshd-session[32015]: Accepted publickey for zuul from 192.168.122.30 port 45480 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 18:57:12 compute-0 systemd-logind[822]: New session 9 of user zuul.
Feb 19 18:57:12 compute-0 systemd[1]: Started Session 9 of User zuul.
Feb 19 18:57:12 compute-0 sshd-session[32015]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 18:57:13 compute-0 python3.9[32168]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 18:57:13 compute-0 sshd-session[32018]: Connection closed by 192.168.122.30 port 45480
Feb 19 18:57:13 compute-0 sshd-session[32015]: pam_unix(sshd:session): session closed for user zuul
Feb 19 18:57:13 compute-0 systemd-logind[822]: Session 9 logged out. Waiting for processes to exit.
Feb 19 18:57:13 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Feb 19 18:57:13 compute-0 systemd-logind[822]: Removed session 9.
Feb 19 18:57:16 compute-0 sshd-session[32196]: Invalid user admin from 165.22.221.82 port 53064
Feb 19 18:57:17 compute-0 sshd-session[32196]: Connection closed by invalid user admin 165.22.221.82 port 53064 [preauth]
Feb 19 18:57:29 compute-0 sshd-session[32198]: Accepted publickey for zuul from 192.168.122.30 port 51528 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 18:57:29 compute-0 systemd-logind[822]: New session 10 of user zuul.
Feb 19 18:57:29 compute-0 systemd[1]: Started Session 10 of User zuul.
Feb 19 18:57:29 compute-0 sshd-session[32198]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 18:57:30 compute-0 python3.9[32351]: ansible-ansible.legacy.ping Invoked with data=pong
Feb 19 18:57:31 compute-0 python3.9[32525]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 18:57:32 compute-0 sudo[32675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tetwynwmdqwdbnxrtgulqyxrosraotvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527451.7114215-64-121133251790247/AnsiballZ_command.py'
Feb 19 18:57:32 compute-0 sudo[32675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:57:32 compute-0 python3.9[32678]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 18:57:32 compute-0 sudo[32675]: pam_unix(sudo:session): session closed for user root
Feb 19 18:57:32 compute-0 sudo[32829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hikzbnhhgqortvplmulltxxmvcvkmkjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527452.562195-88-200182179721758/AnsiballZ_stat.py'
Feb 19 18:57:32 compute-0 sudo[32829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:57:33 compute-0 python3.9[32832]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 18:57:33 compute-0 sudo[32829]: pam_unix(sudo:session): session closed for user root
Feb 19 18:57:33 compute-0 sudo[32982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsrnxjpaliprocpllvtdbfnqdxlphwfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527453.31574-104-19413019822474/AnsiballZ_file.py'
Feb 19 18:57:33 compute-0 sudo[32982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:57:33 compute-0 python3.9[32985]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:57:33 compute-0 sudo[32982]: pam_unix(sudo:session): session closed for user root
Feb 19 18:57:34 compute-0 sudo[33135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-butkhchnxyeyobdywfpgcpjhqszhssrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527454.0539155-120-110285234768752/AnsiballZ_stat.py'
Feb 19 18:57:34 compute-0 sudo[33135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:57:34 compute-0 python3.9[33138]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 18:57:34 compute-0 sudo[33135]: pam_unix(sudo:session): session closed for user root
Feb 19 18:57:34 compute-0 sudo[33259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkdubhznxmvlfvkjzhfavihrehxcrsht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527454.0539155-120-110285234768752/AnsiballZ_copy.py'
Feb 19 18:57:34 compute-0 sudo[33259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:57:35 compute-0 python3.9[33262]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1771527454.0539155-120-110285234768752/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:57:35 compute-0 sudo[33259]: pam_unix(sudo:session): session closed for user root
Feb 19 18:57:35 compute-0 sudo[33412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilqidshxppmwgeckwnmhliebtgogvhqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527455.198177-150-174927066864590/AnsiballZ_setup.py'
Feb 19 18:57:35 compute-0 sudo[33412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:57:35 compute-0 python3.9[33415]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 18:57:35 compute-0 sudo[33412]: pam_unix(sudo:session): session closed for user root
Feb 19 18:57:36 compute-0 sudo[33569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iizgbhcjjaxveinwfelwaroqiuhynsaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527456.1537282-166-198950245710624/AnsiballZ_file.py'
Feb 19 18:57:36 compute-0 sudo[33569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:57:36 compute-0 python3.9[33572]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 18:57:36 compute-0 sudo[33569]: pam_unix(sudo:session): session closed for user root
Feb 19 18:57:36 compute-0 sudo[33722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bddidchieymivjlhbuxlopqrglkxxduo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527456.7642348-184-42356542173480/AnsiballZ_file.py'
Feb 19 18:57:36 compute-0 sudo[33722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:57:37 compute-0 python3.9[33725]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 18:57:37 compute-0 sudo[33722]: pam_unix(sudo:session): session closed for user root
Feb 19 18:57:37 compute-0 python3.9[33875]: ansible-ansible.builtin.service_facts Invoked
Feb 19 18:57:41 compute-0 python3.9[34129]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:57:42 compute-0 python3.9[34279]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 18:57:43 compute-0 python3.9[34433]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 18:57:43 compute-0 sudo[34589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swivceuevnknyiwjqexhwnaggqzjaaxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527463.7593646-280-23158728790694/AnsiballZ_setup.py'
Feb 19 18:57:43 compute-0 sudo[34589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:57:44 compute-0 python3.9[34592]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 19 18:57:44 compute-0 sudo[34589]: pam_unix(sudo:session): session closed for user root
Feb 19 18:57:44 compute-0 sudo[34674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxtcjnpuczkfjbqjtjsdslqykkybzdhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527463.7593646-280-23158728790694/AnsiballZ_dnf.py'
Feb 19 18:57:44 compute-0 sudo[34674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:57:44 compute-0 python3.9[34677]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 19 18:57:46 compute-0 sshd-session[34715]: Connection closed by 103.213.244.180 port 50722 [preauth]
Feb 19 18:58:13 compute-0 sshd-session[34825]: Invalid user ubuntu from 43.166.137.151 port 44220
Feb 19 18:58:13 compute-0 sshd-session[34825]: Received disconnect from 43.166.137.151 port 44220:11: Bye Bye [preauth]
Feb 19 18:58:13 compute-0 sshd-session[34825]: Disconnected from invalid user ubuntu 43.166.137.151 port 44220 [preauth]
Feb 19 18:58:18 compute-0 sshd-session[34827]: Invalid user admin from 165.22.221.82 port 49236
Feb 19 18:58:18 compute-0 sshd-session[34827]: Connection closed by invalid user admin 165.22.221.82 port 49236 [preauth]
Feb 19 18:58:28 compute-0 systemd[1]: Reloading.
Feb 19 18:58:29 compute-0 systemd-rc-local-generator[34876]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 18:58:29 compute-0 systemd[1]: Starting dnf makecache...
Feb 19 18:58:29 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Feb 19 18:58:29 compute-0 dnf[34899]: Repository 'gating-repo' is missing name in configuration, using id.
Feb 19 18:58:29 compute-0 dnf[34899]: Failed determining last makecache time.
Feb 19 18:58:29 compute-0 dnf[34899]: delorean-openstack-barbican-42b4c41831408a8e323 123 kB/s | 3.0 kB     00:00
Feb 19 18:58:29 compute-0 dnf[34899]: delorean-python-glean-642fffe0203a8ffcc2443db52 151 kB/s | 3.0 kB     00:00
Feb 19 18:58:29 compute-0 dnf[34899]: delorean-openstack-cinder-1c00d6490d88e436f26ef 153 kB/s | 3.0 kB     00:00
Feb 19 18:58:29 compute-0 systemd[1]: Reloading.
Feb 19 18:58:29 compute-0 dnf[34899]: delorean-python-stevedore-c4acc5639fd2329372142 142 kB/s | 3.0 kB     00:00
Feb 19 18:58:29 compute-0 systemd-rc-local-generator[34936]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 18:58:29 compute-0 dnf[34899]: delorean-python-cloudkitty-tests-tempest-783703 126 kB/s | 3.0 kB     00:00
Feb 19 18:58:29 compute-0 dnf[34899]: delorean-diskimage-builder-61b717cc45660834fe9a 148 kB/s | 3.0 kB     00:00
Feb 19 18:58:29 compute-0 dnf[34899]: delorean-openstack-nova-eaa65f0b85123a4ee343246 155 kB/s | 3.0 kB     00:00
Feb 19 18:58:29 compute-0 dnf[34899]: delorean-python-designate-tests-tempest-347fdbc 169 kB/s | 3.0 kB     00:00
Feb 19 18:58:29 compute-0 dnf[34899]: delorean-openstack-glance-1fd12c29b339f30fe823e 174 kB/s | 3.0 kB     00:00
Feb 19 18:58:29 compute-0 dnf[34899]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 149 kB/s | 3.0 kB     00:00
Feb 19 18:58:29 compute-0 dnf[34899]: delorean-openstack-manila-d783d10e75495b73866db 154 kB/s | 3.0 kB     00:00
Feb 19 18:58:29 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Feb 19 18:58:29 compute-0 dnf[34899]: delorean-openstack-neutron-95cadbd379667c8520c8 142 kB/s | 3.0 kB     00:00
Feb 19 18:58:29 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Feb 19 18:58:29 compute-0 dnf[34899]: delorean-openstack-octavia-5975097dd4b021385178 159 kB/s | 3.0 kB     00:00
Feb 19 18:58:29 compute-0 systemd[1]: Reloading.
Feb 19 18:58:29 compute-0 dnf[34899]: delorean-openstack-watcher-c014f81a8647287f6dcc 161 kB/s | 3.0 kB     00:00
Feb 19 18:58:29 compute-0 dnf[34899]: delorean-python-tcib-78032d201b02cee27e8e644c61 159 kB/s | 3.0 kB     00:00
Feb 19 18:58:29 compute-0 dnf[34899]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 132 kB/s | 3.0 kB     00:00
Feb 19 18:58:29 compute-0 systemd-rc-local-generator[34991]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 18:58:29 compute-0 dnf[34899]: delorean-openstack-swift-dc98a8463506ac520c469a 155 kB/s | 3.0 kB     00:00
Feb 19 18:58:29 compute-0 dnf[34899]: delorean-python-tempestconf-8515371b7cceebd4282 149 kB/s | 3.0 kB     00:00
Feb 19 18:58:29 compute-0 dnf[34899]: delorean-openstack-heat-ui-013accbfd179753bc3f0 148 kB/s | 3.0 kB     00:00
Feb 19 18:58:29 compute-0 dnf[34899]: gating-repo                                     277 kB/s | 1.5 kB     00:00
Feb 19 18:58:29 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Feb 19 18:58:29 compute-0 dnf[34899]: CentOS Stream 9 - BaseOS                         68 kB/s | 7.0 kB     00:00
Feb 19 18:58:29 compute-0 dbus-broker-launch[794]: Noticed file-system modification, trigger reload.
Feb 19 18:58:29 compute-0 dbus-broker-launch[794]: Noticed file-system modification, trigger reload.
Feb 19 18:58:30 compute-0 dnf[34899]: CentOS Stream 9 - AppStream                      58 kB/s | 7.1 kB     00:00
Feb 19 18:58:30 compute-0 dnf[34899]: CentOS Stream 9 - CRB                            29 kB/s | 6.9 kB     00:00
Feb 19 18:58:30 compute-0 dnf[34899]: CentOS Stream 9 - Extras packages                32 kB/s | 7.6 kB     00:00
Feb 19 18:58:30 compute-0 dnf[34899]: dlrn-antelope-testing                            91 kB/s | 3.0 kB     00:00
Feb 19 18:58:30 compute-0 dnf[34899]: dlrn-antelope-build-deps                        171 kB/s | 3.0 kB     00:00
Feb 19 18:58:30 compute-0 dnf[34899]: centos9-rabbitmq                                118 kB/s | 3.0 kB     00:00
Feb 19 18:58:30 compute-0 dnf[34899]: centos9-storage                                 135 kB/s | 3.0 kB     00:00
Feb 19 18:58:30 compute-0 dnf[34899]: centos9-opstools                                139 kB/s | 3.0 kB     00:00
Feb 19 18:58:30 compute-0 dnf[34899]: NFV SIG OpenvSwitch                             126 kB/s | 3.0 kB     00:00
Feb 19 18:58:30 compute-0 dnf[34899]: repo-setup-centos-appstream                     188 kB/s | 4.4 kB     00:00
Feb 19 18:58:30 compute-0 dnf[34899]: repo-setup-centos-baseos                        170 kB/s | 3.9 kB     00:00
Feb 19 18:58:31 compute-0 dnf[34899]: repo-setup-centos-highavailability              167 kB/s | 3.9 kB     00:00
Feb 19 18:58:31 compute-0 dnf[34899]: repo-setup-centos-powertools                    198 kB/s | 4.3 kB     00:00
Feb 19 18:58:31 compute-0 dnf[34899]: Extra Packages for Enterprise Linux 9 - x86_64  227 kB/s |  30 kB     00:00
Feb 19 18:58:31 compute-0 dnf[34899]: Metadata cache created.
Feb 19 18:58:31 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Feb 19 18:58:31 compute-0 systemd[1]: Finished dnf makecache.
Feb 19 18:58:31 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.602s CPU time.
Feb 19 18:58:36 compute-0 sshd-session[35066]: Invalid user ubuntu from 27.50.25.190 port 50454
Feb 19 18:58:36 compute-0 sshd-session[35066]: Received disconnect from 27.50.25.190 port 50454:11: Bye Bye [preauth]
Feb 19 18:58:36 compute-0 sshd-session[35066]: Disconnected from invalid user ubuntu 27.50.25.190 port 50454 [preauth]
Feb 19 18:59:20 compute-0 sshd-session[35235]: Invalid user admin from 165.22.221.82 port 47208
Feb 19 18:59:20 compute-0 sshd-session[35235]: Connection closed by invalid user admin 165.22.221.82 port 47208 [preauth]
Feb 19 18:59:24 compute-0 kernel: SELinux:  Converting 2727 SID table entries...
Feb 19 18:59:24 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 19 18:59:24 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 19 18:59:24 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 19 18:59:24 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 19 18:59:24 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 19 18:59:24 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 19 18:59:24 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 19 18:59:24 compute-0 dbus-broker-launch[804]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Feb 19 18:59:24 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 19 18:59:24 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 19 18:59:24 compute-0 systemd[1]: Reloading.
Feb 19 18:59:24 compute-0 systemd-rc-local-generator[35344]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 18:59:24 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 19 18:59:25 compute-0 sudo[34674]: pam_unix(sudo:session): session closed for user root
Feb 19 18:59:25 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 19 18:59:25 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 19 18:59:25 compute-0 systemd[1]: run-r12bec953181c433fbd238833cda0817b.service: Deactivated successfully.
Feb 19 18:59:29 compute-0 sshd-session[36159]: Invalid user sftptest from 138.255.157.62 port 33985
Feb 19 18:59:29 compute-0 sshd-session[36159]: Received disconnect from 138.255.157.62 port 33985:11: Bye Bye [preauth]
Feb 19 18:59:29 compute-0 sshd-session[36159]: Disconnected from invalid user sftptest 138.255.157.62 port 33985 [preauth]
Feb 19 18:59:40 compute-0 sudo[36286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rponxsbcqkajnedntrjrgayvblfinmsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527579.8142688-304-121776549718950/AnsiballZ_command.py'
Feb 19 18:59:40 compute-0 sudo[36286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:59:40 compute-0 python3.9[36289]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 18:59:41 compute-0 sudo[36286]: pam_unix(sudo:session): session closed for user root
Feb 19 18:59:41 compute-0 sudo[36568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eizbqsskqiprlfsooitgwvkffpmykexk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527581.3775425-320-123114712213491/AnsiballZ_selinux.py'
Feb 19 18:59:41 compute-0 sudo[36568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:59:42 compute-0 python3.9[36571]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Feb 19 18:59:42 compute-0 sudo[36568]: pam_unix(sudo:session): session closed for user root
Feb 19 18:59:42 compute-0 sudo[36721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utcyydyqexrgxjfscmsdwosmrthilmyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527582.5307407-342-48978158553942/AnsiballZ_command.py'
Feb 19 18:59:42 compute-0 sudo[36721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:59:42 compute-0 python3.9[36724]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Feb 19 18:59:43 compute-0 sudo[36721]: pam_unix(sudo:session): session closed for user root
Feb 19 18:59:45 compute-0 sudo[36875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziznjsuubgxxqjssuttgkqhbbwxzmxao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527584.843573-358-30234344844036/AnsiballZ_file.py'
Feb 19 18:59:45 compute-0 sudo[36875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:59:45 compute-0 python3.9[36878]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:59:45 compute-0 sudo[36875]: pam_unix(sudo:session): session closed for user root
Feb 19 18:59:45 compute-0 sudo[37028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adpjmqzkiolxhntydaglmsekjqzovrnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527585.4779325-374-138555038428445/AnsiballZ_mount.py'
Feb 19 18:59:45 compute-0 sudo[37028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:59:46 compute-0 python3.9[37031]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Feb 19 18:59:46 compute-0 sudo[37028]: pam_unix(sudo:session): session closed for user root
Feb 19 18:59:47 compute-0 sudo[37182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfubfoeljxrlesuegexvlfbmanpbolnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527587.0854204-430-92271412604875/AnsiballZ_file.py'
Feb 19 18:59:47 compute-0 sudo[37182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:59:48 compute-0 python3.9[37185]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 18:59:48 compute-0 sudo[37182]: pam_unix(sudo:session): session closed for user root
Feb 19 18:59:49 compute-0 sudo[37335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxiicagahfoyzqtrnuvmrdheyxfopnsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527589.0794342-446-146590489857260/AnsiballZ_stat.py'
Feb 19 18:59:49 compute-0 sudo[37335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:59:49 compute-0 python3.9[37338]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 18:59:49 compute-0 sudo[37335]: pam_unix(sudo:session): session closed for user root
Feb 19 18:59:49 compute-0 sudo[37459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkikmealtjrvvazidaxcilunghishpup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527589.0794342-446-146590489857260/AnsiballZ_copy.py'
Feb 19 18:59:49 compute-0 sudo[37459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:59:50 compute-0 python3.9[37462]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527589.0794342-446-146590489857260/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d5e185114d46c25488692b53a20c24df98df9ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:59:50 compute-0 sudo[37459]: pam_unix(sudo:session): session closed for user root
Feb 19 18:59:53 compute-0 sudo[37614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aajtukrdbmbetydxbmpyxipsuzvvjzgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527593.3092113-494-42227824084456/AnsiballZ_stat.py'
Feb 19 18:59:53 compute-0 sudo[37614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:59:53 compute-0 python3.9[37617]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 18:59:53 compute-0 sudo[37614]: pam_unix(sudo:session): session closed for user root
Feb 19 18:59:54 compute-0 sudo[37767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klgspxudalsjflntmyvlupanxqewcbcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527593.990121-510-236728340075975/AnsiballZ_command.py'
Feb 19 18:59:54 compute-0 sudo[37767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:59:54 compute-0 python3.9[37770]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 18:59:54 compute-0 sudo[37767]: pam_unix(sudo:session): session closed for user root
Feb 19 18:59:54 compute-0 sshd-session[37487]: Received disconnect from 182.75.216.74 port 35959:11: Bye Bye [preauth]
Feb 19 18:59:54 compute-0 sshd-session[37487]: Disconnected from authenticating user root 182.75.216.74 port 35959 [preauth]
Feb 19 18:59:54 compute-0 sudo[37921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysskiqcsxhycvzzlcoukzgyexygazlay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527594.6379995-526-138215524793402/AnsiballZ_file.py'
Feb 19 18:59:54 compute-0 sudo[37921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:59:55 compute-0 python3.9[37924]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 18:59:55 compute-0 sudo[37921]: pam_unix(sudo:session): session closed for user root
Feb 19 18:59:56 compute-0 sudo[38074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkjqvmhslzexftmleogawqnvvrvhyqzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527595.4477444-548-2293823136923/AnsiballZ_getent.py'
Feb 19 18:59:56 compute-0 sudo[38074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:59:56 compute-0 python3.9[38077]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Feb 19 18:59:56 compute-0 sudo[38074]: pam_unix(sudo:session): session closed for user root
Feb 19 18:59:56 compute-0 rsyslogd[1019]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 19 18:59:56 compute-0 rsyslogd[1019]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 19 18:59:56 compute-0 sudo[38229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbskmqkilkozzbhevxwwdtffutltgpnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527596.4796267-564-22134467737736/AnsiballZ_group.py'
Feb 19 18:59:56 compute-0 sudo[38229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:59:57 compute-0 python3.9[38232]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 19 18:59:57 compute-0 groupadd[38233]: group added to /etc/group: name=qemu, GID=107
Feb 19 18:59:57 compute-0 groupadd[38233]: group added to /etc/gshadow: name=qemu
Feb 19 18:59:57 compute-0 groupadd[38233]: new group: name=qemu, GID=107
Feb 19 18:59:57 compute-0 sudo[38229]: pam_unix(sudo:session): session closed for user root
Feb 19 18:59:57 compute-0 sudo[38388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdmazequgyzlnqukygtjebjxsiwzbivc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527597.3363614-580-157377430325479/AnsiballZ_user.py'
Feb 19 18:59:57 compute-0 sudo[38388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:59:58 compute-0 python3.9[38391]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 19 18:59:58 compute-0 useradd[38393]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/1
Feb 19 18:59:58 compute-0 sudo[38388]: pam_unix(sudo:session): session closed for user root
Feb 19 18:59:58 compute-0 sudo[38549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnsxrftzwrsemzmlfbqntyxzweoacjmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527598.4974296-596-61429063661321/AnsiballZ_getent.py'
Feb 19 18:59:58 compute-0 sudo[38549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:59:58 compute-0 python3.9[38552]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Feb 19 18:59:58 compute-0 sudo[38549]: pam_unix(sudo:session): session closed for user root
Feb 19 18:59:59 compute-0 sudo[38703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geaocncdwcjqyqtvoplldiervmsyykdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527599.1905553-612-58736086885218/AnsiballZ_group.py'
Feb 19 18:59:59 compute-0 sudo[38703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 18:59:59 compute-0 python3.9[38706]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 19 18:59:59 compute-0 groupadd[38707]: group added to /etc/group: name=hugetlbfs, GID=42477
Feb 19 18:59:59 compute-0 groupadd[38707]: group added to /etc/gshadow: name=hugetlbfs
Feb 19 18:59:59 compute-0 groupadd[38707]: new group: name=hugetlbfs, GID=42477
Feb 19 18:59:59 compute-0 sudo[38703]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:00 compute-0 sudo[38862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nogmtjnrnsvtgflkbmjvridvkrcotqff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527599.9057539-630-68943865431049/AnsiballZ_file.py'
Feb 19 19:00:00 compute-0 sudo[38862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:00 compute-0 python3.9[38865]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Feb 19 19:00:00 compute-0 sudo[38862]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:00 compute-0 sudo[39015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsjyfbqbcbijnarrarzsekmxpsacmqee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527600.7452078-652-110990579443252/AnsiballZ_dnf.py'
Feb 19 19:00:00 compute-0 sudo[39015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:01 compute-0 python3.9[39018]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 19 19:00:05 compute-0 sudo[39015]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:06 compute-0 sudo[39169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drlxquwvjuxlvanxubfjqsunirhxacgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527605.9054904-668-192533087177309/AnsiballZ_file.py'
Feb 19 19:00:06 compute-0 sudo[39169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:06 compute-0 python3.9[39172]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:00:06 compute-0 sudo[39169]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:06 compute-0 sudo[39322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utarwhnbbhkrxgftpdjqamnbudcazacv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527606.5076354-684-273981749718130/AnsiballZ_stat.py'
Feb 19 19:00:06 compute-0 sudo[39322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:06 compute-0 python3.9[39325]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:00:06 compute-0 sudo[39322]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:07 compute-0 sudo[39446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzivfmotfvbpkvhucmebokudgshbpdrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527606.5076354-684-273981749718130/AnsiballZ_copy.py'
Feb 19 19:00:07 compute-0 sudo[39446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:07 compute-0 python3.9[39449]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771527606.5076354-684-273981749718130/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:00:07 compute-0 sudo[39446]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:08 compute-0 sudo[39599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puxhrsgnjraartitzvkpbhzdkzeveefh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527607.5825584-714-101465448564544/AnsiballZ_systemd.py'
Feb 19 19:00:08 compute-0 sudo[39599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:08 compute-0 python3.9[39602]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 19 19:00:08 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 19 19:00:08 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Feb 19 19:00:08 compute-0 kernel: Bridge firewalling registered
Feb 19 19:00:08 compute-0 systemd-modules-load[39606]: Inserted module 'br_netfilter'
Feb 19 19:00:08 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 19 19:00:08 compute-0 sudo[39599]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:09 compute-0 sudo[39760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qacncfdzeouedveftcoyfjbyguhlswbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527608.8121848-730-73397407366805/AnsiballZ_stat.py'
Feb 19 19:00:09 compute-0 sudo[39760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:09 compute-0 python3.9[39763]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:00:09 compute-0 sudo[39760]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:09 compute-0 sudo[39884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdgsxisdzsnsoszvuhkcvidtydrtysko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527608.8121848-730-73397407366805/AnsiballZ_copy.py'
Feb 19 19:00:09 compute-0 sudo[39884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:09 compute-0 python3.9[39887]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771527608.8121848-730-73397407366805/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:00:09 compute-0 sudo[39884]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:10 compute-0 sudo[40037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmrslkxxgmqxcnaxtoyilunpyvckzhcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527610.1413772-766-233207469338932/AnsiballZ_dnf.py'
Feb 19 19:00:10 compute-0 sudo[40037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:10 compute-0 python3.9[40040]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 19 19:00:13 compute-0 dbus-broker-launch[794]: Noticed file-system modification, trigger reload.
Feb 19 19:00:13 compute-0 dbus-broker-launch[794]: Noticed file-system modification, trigger reload.
Feb 19 19:00:13 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 19 19:00:13 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 19 19:00:13 compute-0 systemd[1]: Reloading.
Feb 19 19:00:14 compute-0 systemd-rc-local-generator[40100]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:00:14 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 19 19:00:14 compute-0 sudo[40037]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:16 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 19 19:00:16 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 19 19:00:16 compute-0 systemd[1]: man-db-cache-update.service: Consumed 2.854s CPU time.
Feb 19 19:00:16 compute-0 systemd[1]: run-re5aa5ce877764ac3991997b5d96ef3b2.service: Deactivated successfully.
Feb 19 19:00:16 compute-0 python3.9[43837]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:00:17 compute-0 python3.9[43989]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Feb 19 19:00:18 compute-0 python3.9[44139]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:00:18 compute-0 sudo[44289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orspyzyyqgvbhewigshxsawqhpcdsuqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527618.6440158-844-54716056328189/AnsiballZ_command.py'
Feb 19 19:00:18 compute-0 sudo[44289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:18 compute-0 python3.9[44292]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:00:19 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 19 19:00:19 compute-0 systemd[1]: Starting Authorization Manager...
Feb 19 19:00:19 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Feb 19 19:00:19 compute-0 polkitd[44509]: Started polkitd version 0.117
Feb 19 19:00:19 compute-0 polkitd[44509]: Loading rules from directory /etc/polkit-1/rules.d
Feb 19 19:00:19 compute-0 polkitd[44509]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 19 19:00:19 compute-0 polkitd[44509]: Finished loading, compiling and executing 2 rules
Feb 19 19:00:19 compute-0 systemd[1]: Started Authorization Manager.
Feb 19 19:00:19 compute-0 polkitd[44509]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Feb 19 19:00:19 compute-0 sudo[44289]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:20 compute-0 sudo[44677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcrwfqtveoersdrmpyembolfsoplqtvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527620.0075364-862-155005739126855/AnsiballZ_systemd.py'
Feb 19 19:00:20 compute-0 sudo[44677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:20 compute-0 python3.9[44680]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:00:20 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 19 19:00:20 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Feb 19 19:00:20 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 19 19:00:20 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 19 19:00:20 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Feb 19 19:00:20 compute-0 sudo[44677]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:21 compute-0 python3.9[44842]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Feb 19 19:00:23 compute-0 sudo[44992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulawfbfldzthmaklgtzkmxsnohjcrpxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527623.5603118-976-182454823124526/AnsiballZ_systemd.py'
Feb 19 19:00:23 compute-0 sudo[44992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:24 compute-0 python3.9[44995]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:00:24 compute-0 systemd[1]: Reloading.
Feb 19 19:00:24 compute-0 systemd-rc-local-generator[45020]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:00:24 compute-0 sudo[44992]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:24 compute-0 sudo[45189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvlysjifmqrcvullzvngjajedmtohmyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527624.3595853-976-111060259888740/AnsiballZ_systemd.py'
Feb 19 19:00:24 compute-0 sudo[45189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:24 compute-0 python3.9[45192]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:00:24 compute-0 systemd[1]: Reloading.
Feb 19 19:00:24 compute-0 systemd-rc-local-generator[45217]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:00:25 compute-0 sudo[45189]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:25 compute-0 sudo[45385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnvdtdjlupksispqhqldhorbcaijysod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527625.4058995-1008-98531981921173/AnsiballZ_command.py'
Feb 19 19:00:25 compute-0 sudo[45385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:25 compute-0 python3.9[45388]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:00:25 compute-0 sudo[45385]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:26 compute-0 sudo[45539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icuyuifjvvleynsytusxmiarecazihab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527625.976745-1024-218075859925210/AnsiballZ_command.py'
Feb 19 19:00:26 compute-0 sudo[45539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:26 compute-0 python3.9[45542]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:00:26 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Feb 19 19:00:26 compute-0 sudo[45539]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:26 compute-0 sudo[45693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wajpjtwtcelgoaveoqrtprfsyeteyriu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527626.4783726-1040-31249945091702/AnsiballZ_command.py'
Feb 19 19:00:26 compute-0 sudo[45693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:26 compute-0 python3.9[45696]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:00:28 compute-0 sudo[45693]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:28 compute-0 sudo[45856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqrzhdjexakmnyqhfyhnqdmdjfpcnoup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527628.3321435-1056-265238022373155/AnsiballZ_command.py'
Feb 19 19:00:28 compute-0 sudo[45856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:28 compute-0 python3.9[45859]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:00:29 compute-0 sudo[45856]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:30 compute-0 sudo[46010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekgvamerqntorzfvwlqlldjkfndthkiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527629.8059602-1072-41753590612604/AnsiballZ_systemd.py'
Feb 19 19:00:30 compute-0 sudo[46010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:30 compute-0 python3.9[46013]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 19 19:00:30 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 19 19:00:30 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Feb 19 19:00:30 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Feb 19 19:00:30 compute-0 systemd[1]: Starting Apply Kernel Variables...
Feb 19 19:00:30 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 19 19:00:30 compute-0 systemd[1]: Finished Apply Kernel Variables.
Feb 19 19:00:30 compute-0 sudo[46010]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:30 compute-0 sshd-session[32201]: Connection closed by 192.168.122.30 port 51528
Feb 19 19:00:30 compute-0 sshd-session[32198]: pam_unix(sshd:session): session closed for user zuul
Feb 19 19:00:30 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Feb 19 19:00:30 compute-0 systemd[1]: session-10.scope: Consumed 1min 57.036s CPU time.
Feb 19 19:00:30 compute-0 systemd-logind[822]: Session 10 logged out. Waiting for processes to exit.
Feb 19 19:00:30 compute-0 systemd-logind[822]: Removed session 10.
Feb 19 19:00:36 compute-0 sshd-session[46044]: Accepted publickey for zuul from 192.168.122.30 port 60834 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 19:00:36 compute-0 systemd-logind[822]: New session 11 of user zuul.
Feb 19 19:00:36 compute-0 systemd[1]: Started Session 11 of User zuul.
Feb 19 19:00:36 compute-0 sshd-session[46044]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 19:00:37 compute-0 python3.9[46197]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:00:38 compute-0 python3.9[46351]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:00:39 compute-0 sudo[46505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rinbiroymbkvybysneflpokdsevqlvcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527639.2232914-75-74136490866631/AnsiballZ_command.py'
Feb 19 19:00:39 compute-0 sudo[46505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:39 compute-0 python3.9[46508]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:00:39 compute-0 sudo[46505]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:40 compute-0 python3.9[46659]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:00:41 compute-0 sudo[46813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkvmcbzbodfxoxusfjqemfvhbwplhmpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527641.1507368-115-26566820096441/AnsiballZ_setup.py'
Feb 19 19:00:41 compute-0 sudo[46813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:41 compute-0 python3.9[46816]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 19 19:00:41 compute-0 sudo[46813]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:42 compute-0 sudo[46898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehkweycokflimalkyllchieznzzhsobn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527641.1507368-115-26566820096441/AnsiballZ_dnf.py'
Feb 19 19:00:42 compute-0 sudo[46898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:42 compute-0 python3.9[46901]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 19 19:00:43 compute-0 sudo[46898]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:43 compute-0 sudo[47052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elesobwfqtjqtorolckftjzgobbcmhoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527643.769759-139-144442858246290/AnsiballZ_setup.py'
Feb 19 19:00:43 compute-0 sudo[47052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:44 compute-0 python3.9[47055]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 19 19:00:44 compute-0 sudo[47052]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:45 compute-0 sudo[47224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfvhpyyrgckarrknqrtopoqttkuitqgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527644.7365825-161-159541464387330/AnsiballZ_file.py'
Feb 19 19:00:45 compute-0 sudo[47224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:45 compute-0 python3.9[47227]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:00:45 compute-0 sudo[47224]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:45 compute-0 sudo[47377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khcbofcllwarommyugikwabqwfjeqxae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527645.4644759-177-113089304617793/AnsiballZ_command.py'
Feb 19 19:00:45 compute-0 sudo[47377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:45 compute-0 python3.9[47380]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:00:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck863002671-merged.mount: Deactivated successfully.
Feb 19 19:00:45 compute-0 podman[47381]: 2026-02-19 19:00:45.866204119 +0000 UTC m=+0.036582318 system refresh
Feb 19 19:00:45 compute-0 sudo[47377]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:46 compute-0 sudo[47541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oidxsvoiydpetlcutnywbmxvpsresajq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527646.0872383-193-35184972020472/AnsiballZ_stat.py'
Feb 19 19:00:46 compute-0 sudo[47541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:46 compute-0 python3.9[47544]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:00:46 compute-0 sudo[47541]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:46 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:00:47 compute-0 sudo[47665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkayrigyjtfvhgrwhzxvlnjahjqnvyar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527646.0872383-193-35184972020472/AnsiballZ_copy.py'
Feb 19 19:00:47 compute-0 sudo[47665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:47 compute-0 python3.9[47668]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527646.0872383-193-35184972020472/.source.json follow=False _original_basename=podman_network_config.j2 checksum=94dd86f5bfe08ddf34d7d24693cb405f77f96fdf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:00:47 compute-0 sudo[47665]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:47 compute-0 sudo[47818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shrabrtrazdstojgmxheyvkfuehjtjkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527647.4210467-223-16101665850213/AnsiballZ_stat.py'
Feb 19 19:00:47 compute-0 sudo[47818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:47 compute-0 python3.9[47821]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:00:47 compute-0 sudo[47818]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:48 compute-0 sudo[47942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqylamvobwqmyrksgslfeukojuslqlgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527647.4210467-223-16101665850213/AnsiballZ_copy.py'
Feb 19 19:00:48 compute-0 sudo[47942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:48 compute-0 python3.9[47945]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771527647.4210467-223-16101665850213/.source.conf follow=False _original_basename=registries.conf.j2 checksum=764cd3bd5ac6fe36afc781649820d9d50556d508 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:00:48 compute-0 sudo[47942]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:48 compute-0 sudo[48095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acdhojcpdcoakxdqgojdscactulkvyfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527648.5723765-255-170150119583265/AnsiballZ_ini_file.py'
Feb 19 19:00:48 compute-0 sudo[48095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:49 compute-0 python3.9[48098]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:00:49 compute-0 sudo[48095]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:49 compute-0 sudo[48248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxirmsyzpplfuwdqavfdpjkxbsjvecku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527649.1802447-255-43978265760575/AnsiballZ_ini_file.py'
Feb 19 19:00:49 compute-0 sudo[48248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:49 compute-0 python3.9[48251]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:00:49 compute-0 sudo[48248]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:49 compute-0 sudo[48401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ridyiyjofoqyieqenpruupbdifzcydyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527649.6668713-255-68769202684292/AnsiballZ_ini_file.py'
Feb 19 19:00:49 compute-0 sudo[48401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:50 compute-0 python3.9[48404]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:00:50 compute-0 sudo[48401]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:50 compute-0 sudo[48554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpjyqyivbnxnzerjuccewepjogocibcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527650.1201677-255-205157462389893/AnsiballZ_ini_file.py'
Feb 19 19:00:50 compute-0 sudo[48554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:50 compute-0 python3.9[48557]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:00:50 compute-0 sudo[48554]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:51 compute-0 python3.9[48707]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:00:51 compute-0 sudo[48859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtleazevmqvodrkbgismkbbzkasfncjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527651.5458703-335-59635370618636/AnsiballZ_dnf.py'
Feb 19 19:00:51 compute-0 sudo[48859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:52 compute-0 python3.9[48862]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 19 19:00:53 compute-0 sudo[48859]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:53 compute-0 sudo[49013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acnnhuxbewwplofmwxfbxucwpedkpnnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527653.347963-351-227202046629480/AnsiballZ_dnf.py'
Feb 19 19:00:53 compute-0 sudo[49013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:53 compute-0 python3.9[49016]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 19 19:00:55 compute-0 sudo[49013]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:56 compute-0 sudo[49175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azkthvdgjiyzhpiseapobikpridmuapx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527656.3159752-371-244437919814060/AnsiballZ_dnf.py'
Feb 19 19:00:56 compute-0 sudo[49175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:56 compute-0 python3.9[49178]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 19 19:00:57 compute-0 sudo[49175]: pam_unix(sudo:session): session closed for user root
Feb 19 19:00:58 compute-0 sudo[49329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikgimfklnaelmxoewyphuurqjidocmkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527658.228433-389-276945820793390/AnsiballZ_dnf.py'
Feb 19 19:00:58 compute-0 sudo[49329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:00:58 compute-0 python3.9[49332]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 19 19:00:59 compute-0 sudo[49329]: pam_unix(sudo:session): session closed for user root
Feb 19 19:01:00 compute-0 sudo[49483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsdiqaxpvmactawabsiilfsynqdrbxyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527660.293268-411-94461172743316/AnsiballZ_dnf.py'
Feb 19 19:01:00 compute-0 sudo[49483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:01:00 compute-0 python3.9[49486]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 19 19:01:01 compute-0 CROND[49489]: (root) CMD (run-parts /etc/cron.hourly)
Feb 19 19:01:01 compute-0 run-parts[49492]: (/etc/cron.hourly) starting 0anacron
Feb 19 19:01:01 compute-0 anacron[49500]: Anacron started on 2026-02-19
Feb 19 19:01:01 compute-0 anacron[49500]: Will run job `cron.daily' in 37 min.
Feb 19 19:01:01 compute-0 anacron[49500]: Will run job `cron.weekly' in 57 min.
Feb 19 19:01:01 compute-0 anacron[49500]: Will run job `cron.monthly' in 77 min.
Feb 19 19:01:01 compute-0 anacron[49500]: Jobs will be executed sequentially
Feb 19 19:01:01 compute-0 run-parts[49502]: (/etc/cron.hourly) finished 0anacron
Feb 19 19:01:01 compute-0 CROND[49488]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 19 19:01:02 compute-0 sudo[49483]: pam_unix(sudo:session): session closed for user root
Feb 19 19:01:02 compute-0 sudo[49655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tviihrkudevjxiodiksoktcbwfvpxsvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527662.4990177-427-101524408799062/AnsiballZ_dnf.py'
Feb 19 19:01:02 compute-0 sudo[49655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:01:02 compute-0 python3.9[49658]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 19 19:01:05 compute-0 sudo[49655]: pam_unix(sudo:session): session closed for user root
Feb 19 19:01:07 compute-0 sudo[49825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abpwxtxzsgokmhaxiosbiauckulskqso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527667.3263144-445-85849734099461/AnsiballZ_dnf.py'
Feb 19 19:01:07 compute-0 sudo[49825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:01:07 compute-0 python3.9[49828]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 19 19:01:09 compute-0 sudo[49825]: pam_unix(sudo:session): session closed for user root
Feb 19 19:01:09 compute-0 sudo[49979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cffuwqlfqybozpjpddrafiqejxwhqxsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527669.2155335-463-56239091849197/AnsiballZ_dnf.py'
Feb 19 19:01:09 compute-0 sudo[49979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:01:09 compute-0 python3.9[49982]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 19 19:01:13 compute-0 sshd-session[49994]: Invalid user mailuser from 43.166.137.151 port 47980
Feb 19 19:01:13 compute-0 sshd-session[49994]: Received disconnect from 43.166.137.151 port 47980:11: Bye Bye [preauth]
Feb 19 19:01:13 compute-0 sshd-session[49994]: Disconnected from invalid user mailuser 43.166.137.151 port 47980 [preauth]
Feb 19 19:01:29 compute-0 sudo[49979]: pam_unix(sudo:session): session closed for user root
Feb 19 19:01:30 compute-0 sudo[50318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buyzgyxivxxwamssckmbwxkubepphcru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527690.1255703-481-229004010477498/AnsiballZ_dnf.py'
Feb 19 19:01:30 compute-0 sudo[50318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:01:30 compute-0 python3.9[50321]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 19 19:01:31 compute-0 sudo[50318]: pam_unix(sudo:session): session closed for user root
Feb 19 19:01:32 compute-0 sudo[50475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyfquknknexsmmdpwyigxgilorhuqfbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527692.4233627-501-212761417980397/AnsiballZ_dnf.py'
Feb 19 19:01:32 compute-0 sudo[50475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:01:32 compute-0 python3.9[50478]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 19 19:01:34 compute-0 sudo[50475]: pam_unix(sudo:session): session closed for user root
Feb 19 19:01:35 compute-0 sudo[50633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwgbxejlkvfyeaddmosqbccymautptra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527695.1103117-523-194900412052044/AnsiballZ_file.py'
Feb 19 19:01:35 compute-0 sudo[50633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:01:35 compute-0 python3.9[50636]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:01:35 compute-0 sudo[50633]: pam_unix(sudo:session): session closed for user root
Feb 19 19:01:35 compute-0 sudo[50809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfkqxmhcxudkdlzctrsfovkqtqsrwtei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527695.7221751-539-142030340253683/AnsiballZ_stat.py'
Feb 19 19:01:35 compute-0 sudo[50809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:01:36 compute-0 python3.9[50812]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:01:36 compute-0 sudo[50809]: pam_unix(sudo:session): session closed for user root
Feb 19 19:01:36 compute-0 sudo[50933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yupjkagfnesdulyybwusbywohozqckds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527695.7221751-539-142030340253683/AnsiballZ_copy.py'
Feb 19 19:01:36 compute-0 sudo[50933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:01:36 compute-0 python3.9[50936]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1771527695.7221751-539-142030340253683/.source.json _original_basename=.d_g6kn_l follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:01:36 compute-0 sudo[50933]: pam_unix(sudo:session): session closed for user root
Feb 19 19:01:37 compute-0 sudo[51086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkabcqlebloyqsjguyvwqckyzoecmbfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527697.0309753-575-196469528064641/AnsiballZ_podman_image.py'
Feb 19 19:01:37 compute-0 sudo[51086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:01:37 compute-0 python3.9[51089]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 19 19:01:37 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:01:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat3619134996-lower\x2dmapped.mount: Deactivated successfully.
Feb 19 19:01:44 compute-0 podman[51101]: 2026-02-19 19:01:44.412955478 +0000 UTC m=+6.585869597 image pull 5a85504e92eb83605c2458586ce4ee1b65cbcc1df72633eee3ac25690daabc9a 38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Feb 19 19:01:44 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:01:44 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:01:44 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:01:44 compute-0 sudo[51086]: pam_unix(sudo:session): session closed for user root
Feb 19 19:01:45 compute-0 sudo[51400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obfyvxqmpmejoyytxpelqqeoreytuaot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527704.9748337-597-177862354458683/AnsiballZ_podman_image.py'
Feb 19 19:01:45 compute-0 sudo[51400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:01:45 compute-0 python3.9[51403]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 19 19:01:45 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:01:58 compute-0 podman[51415]: 2026-02-19 19:01:58.183495037 +0000 UTC m=+12.570767921 image pull dfc4ee2c621ea595c16a7cd7a8233293e80df6b99346b1ce1a7ed22a35aace27 38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Feb 19 19:01:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:01:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:01:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:01:58 compute-0 sudo[51400]: pam_unix(sudo:session): session closed for user root
Feb 19 19:01:58 compute-0 sudo[51729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfoulekvcoulboycknobbpynniyrtcpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527718.637921-617-34427607712297/AnsiballZ_podman_image.py'
Feb 19 19:01:58 compute-0 sudo[51729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:01:59 compute-0 python3.9[51732]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 19 19:02:12 compute-0 podman[51744]: 2026-02-19 19:02:12.457383775 +0000 UTC m=+13.402052494 image pull a2cf355e1328741433ea45a579c41828962820431e14bc44b297a8d036ff250d 38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Feb 19 19:02:12 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:02:12 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:02:12 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:02:12 compute-0 sudo[51729]: pam_unix(sudo:session): session closed for user root
Feb 19 19:02:14 compute-0 sudo[52032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upmdnsmmfwzbrutyhkpmixgtcuczyvow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527734.424225-639-69535260447688/AnsiballZ_podman_image.py'
Feb 19 19:02:14 compute-0 sudo[52032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:02:14 compute-0 python3.9[52035]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.75:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 19 19:02:14 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:02:16 compute-0 podman[52046]: 2026-02-19 19:02:16.409660198 +0000 UTC m=+1.487673404 image pull e78c96d75328e2ccc822df2bc74cc91219b31184b692c235f13423d5a17d299a 38.102.83.75:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest
Feb 19 19:02:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:02:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:02:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:02:16 compute-0 sudo[52032]: pam_unix(sudo:session): session closed for user root
Feb 19 19:02:16 compute-0 sudo[52299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qentqsghdirjdrgunefcifqewangkwxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527736.675234-639-132887059915134/AnsiballZ_podman_image.py'
Feb 19 19:02:16 compute-0 sudo[52299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:02:17 compute-0 python3.9[52302]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 19 19:02:18 compute-0 podman[52314]: 2026-02-19 19:02:18.178819971 +0000 UTC m=+1.059991158 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Feb 19 19:02:18 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:02:18 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:02:18 compute-0 sudo[52299]: pam_unix(sudo:session): session closed for user root
Feb 19 19:02:18 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:02:23 compute-0 sshd-session[46047]: Connection closed by 192.168.122.30 port 60834
Feb 19 19:02:23 compute-0 sshd-session[46044]: pam_unix(sshd:session): session closed for user zuul
Feb 19 19:02:23 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Feb 19 19:02:23 compute-0 systemd[1]: session-11.scope: Consumed 1min 32.013s CPU time.
Feb 19 19:02:23 compute-0 systemd-logind[822]: Session 11 logged out. Waiting for processes to exit.
Feb 19 19:02:23 compute-0 systemd-logind[822]: Removed session 11.
Feb 19 19:02:28 compute-0 sshd-session[52460]: Accepted publickey for zuul from 192.168.122.30 port 58732 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 19:02:28 compute-0 systemd-logind[822]: New session 12 of user zuul.
Feb 19 19:02:28 compute-0 systemd[1]: Started Session 12 of User zuul.
Feb 19 19:02:28 compute-0 sshd-session[52460]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 19:02:29 compute-0 python3.9[52613]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:02:30 compute-0 sudo[52767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvnrokjpsztinzfsxfzypnwxrpakyzlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527750.4809504-49-189099926429125/AnsiballZ_getent.py'
Feb 19 19:02:30 compute-0 sudo[52767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:02:30 compute-0 python3.9[52770]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Feb 19 19:02:31 compute-0 sudo[52767]: pam_unix(sudo:session): session closed for user root
Feb 19 19:02:31 compute-0 sudo[52921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvjprxxnrjxrepdabamyfrnjattjhiij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527751.1525245-65-8720117357702/AnsiballZ_group.py'
Feb 19 19:02:31 compute-0 sudo[52921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:02:31 compute-0 python3.9[52924]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 19 19:02:31 compute-0 groupadd[52925]: group added to /etc/group: name=openvswitch, GID=42476
Feb 19 19:02:31 compute-0 groupadd[52925]: group added to /etc/gshadow: name=openvswitch
Feb 19 19:02:31 compute-0 groupadd[52925]: new group: name=openvswitch, GID=42476
Feb 19 19:02:31 compute-0 sudo[52921]: pam_unix(sudo:session): session closed for user root
Feb 19 19:02:32 compute-0 sudo[53080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqebknlnygrlebqempsbtfjjfomuzamr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527751.9655857-81-119423204572606/AnsiballZ_user.py'
Feb 19 19:02:32 compute-0 sudo[53080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:02:32 compute-0 python3.9[53083]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 19 19:02:32 compute-0 useradd[53085]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/1
Feb 19 19:02:32 compute-0 useradd[53085]: add 'openvswitch' to group 'hugetlbfs'
Feb 19 19:02:32 compute-0 useradd[53085]: add 'openvswitch' to shadow group 'hugetlbfs'
Feb 19 19:02:32 compute-0 sudo[53080]: pam_unix(sudo:session): session closed for user root
Feb 19 19:02:33 compute-0 sudo[53241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykvvehklqilfgqgxffnioimjtiylivvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527753.330442-101-142133636264537/AnsiballZ_setup.py'
Feb 19 19:02:33 compute-0 sudo[53241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:02:33 compute-0 python3.9[53244]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 19 19:02:34 compute-0 sudo[53241]: pam_unix(sudo:session): session closed for user root
Feb 19 19:02:34 compute-0 sudo[53326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsymeqilxlznbttjbsyqcbcvkbeqwgfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527753.330442-101-142133636264537/AnsiballZ_dnf.py'
Feb 19 19:02:34 compute-0 sudo[53326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:02:34 compute-0 python3.9[53329]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 19 19:02:35 compute-0 sudo[53326]: pam_unix(sudo:session): session closed for user root
Feb 19 19:02:36 compute-0 sudo[53489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsbhrypgdfbvedeikdxohqbxygjfayax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527756.4725037-129-139822164942302/AnsiballZ_dnf.py'
Feb 19 19:02:36 compute-0 sudo[53489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:02:36 compute-0 python3.9[53492]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 19 19:02:38 compute-0 sshd-session[53494]: Invalid user teamspeak3 from 27.50.25.190 port 50156
Feb 19 19:02:38 compute-0 sshd-session[53494]: Received disconnect from 27.50.25.190 port 50156:11: Bye Bye [preauth]
Feb 19 19:02:38 compute-0 sshd-session[53494]: Disconnected from invalid user teamspeak3 27.50.25.190 port 50156 [preauth]
Feb 19 19:02:48 compute-0 kernel: SELinux:  Converting 2741 SID table entries...
Feb 19 19:02:48 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 19 19:02:48 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 19 19:02:48 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 19 19:02:48 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 19 19:02:48 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 19 19:02:48 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 19 19:02:48 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 19 19:02:48 compute-0 groupadd[53517]: group added to /etc/group: name=unbound, GID=994
Feb 19 19:02:48 compute-0 groupadd[53517]: group added to /etc/gshadow: name=unbound
Feb 19 19:02:48 compute-0 groupadd[53517]: new group: name=unbound, GID=994
Feb 19 19:02:48 compute-0 useradd[53524]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Feb 19 19:02:48 compute-0 dbus-broker-launch[804]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Feb 19 19:02:48 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Feb 19 19:02:49 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 19 19:02:49 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 19 19:02:49 compute-0 systemd[1]: Reloading.
Feb 19 19:02:50 compute-0 systemd-rc-local-generator[54022]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:02:50 compute-0 systemd-sysv-generator[54027]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:02:50 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 19 19:02:50 compute-0 sudo[53489]: pam_unix(sudo:session): session closed for user root
Feb 19 19:02:50 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 19 19:02:50 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 19 19:02:50 compute-0 systemd[1]: run-r819fa877ceb54e9296795247cf2fdb27.service: Deactivated successfully.
Feb 19 19:02:51 compute-0 sudo[54615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwjjvmaqxbzmfoqvvsvrkahytfpidczn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527770.8831356-145-150759779229395/AnsiballZ_systemd.py'
Feb 19 19:02:51 compute-0 sudo[54615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:02:51 compute-0 python3.9[54618]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 19 19:02:51 compute-0 systemd[1]: Reloading.
Feb 19 19:02:51 compute-0 systemd-rc-local-generator[54650]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:02:51 compute-0 systemd-sysv-generator[54653]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:02:51 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Feb 19 19:02:51 compute-0 chown[54667]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Feb 19 19:02:52 compute-0 ovs-ctl[54672]: /etc/openvswitch/conf.db does not exist ... (warning).
Feb 19 19:02:52 compute-0 ovs-ctl[54672]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Feb 19 19:02:52 compute-0 ovs-ctl[54672]: Starting ovsdb-server [  OK  ]
Feb 19 19:02:52 compute-0 ovs-vsctl[54721]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Feb 19 19:02:52 compute-0 ovs-vsctl[54741]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"e8e72127-2f6b-43eb-b51a-e32006a33d3c\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Feb 19 19:02:52 compute-0 ovs-ctl[54672]: Configuring Open vSwitch system IDs [  OK  ]
Feb 19 19:02:52 compute-0 ovs-ctl[54672]: Enabling remote OVSDB managers [  OK  ]
Feb 19 19:02:52 compute-0 ovs-vsctl[54747]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Feb 19 19:02:52 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Feb 19 19:02:52 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Feb 19 19:02:52 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Feb 19 19:02:52 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Feb 19 19:02:52 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Feb 19 19:02:52 compute-0 ovs-ctl[54791]: Inserting openvswitch module [  OK  ]
Feb 19 19:02:52 compute-0 ovs-ctl[54760]: Starting ovs-vswitchd [  OK  ]
Feb 19 19:02:52 compute-0 ovs-ctl[54760]: Enabling remote OVSDB managers [  OK  ]
Feb 19 19:02:52 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Feb 19 19:02:52 compute-0 ovs-vsctl[54809]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Feb 19 19:02:52 compute-0 systemd[1]: Starting Open vSwitch...
Feb 19 19:02:52 compute-0 systemd[1]: Finished Open vSwitch.
Feb 19 19:02:52 compute-0 sudo[54615]: pam_unix(sudo:session): session closed for user root
Feb 19 19:02:53 compute-0 python3.9[54960]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:02:53 compute-0 sudo[55110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqivrpllbdxilwpdzlitmwxhfkfiyhcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527773.422266-183-86472455852698/AnsiballZ_sefcontext.py'
Feb 19 19:02:53 compute-0 sudo[55110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:02:54 compute-0 python3.9[55113]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Feb 19 19:02:55 compute-0 kernel: SELinux:  Converting 2755 SID table entries...
Feb 19 19:02:55 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 19 19:02:55 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 19 19:02:55 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 19 19:02:55 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 19 19:02:55 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 19 19:02:55 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 19 19:02:55 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 19 19:02:55 compute-0 sudo[55110]: pam_unix(sudo:session): session closed for user root
Feb 19 19:02:56 compute-0 python3.9[55268]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:02:56 compute-0 sudo[55424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojronilxzigmuqoqovnnlhkqwvheaakj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527776.7973797-219-104291408855481/AnsiballZ_dnf.py'
Feb 19 19:02:56 compute-0 dbus-broker-launch[804]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Feb 19 19:02:56 compute-0 sudo[55424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:02:57 compute-0 python3.9[55427]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 19 19:02:58 compute-0 sudo[55424]: pam_unix(sudo:session): session closed for user root
Feb 19 19:02:58 compute-0 sudo[55578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fiungxioduqqooqzdcmimvcatjlptnmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527778.5907984-235-249230368743830/AnsiballZ_command.py'
Feb 19 19:02:58 compute-0 sudo[55578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:02:59 compute-0 python3.9[55581]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:02:59 compute-0 sudo[55578]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:00 compute-0 sudo[55866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voxmsyouesjymjqddbumgrtluvjoajol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527779.9379268-251-240560120080133/AnsiballZ_file.py'
Feb 19 19:03:00 compute-0 sudo[55866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:00 compute-0 python3.9[55869]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Feb 19 19:03:00 compute-0 sudo[55866]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:01 compute-0 python3.9[56021]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:03:01 compute-0 sshd-session[55946]: Invalid user ubuntu from 138.255.157.62 port 61246
Feb 19 19:03:01 compute-0 sudo[56173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efwocpwragqxghokzbxlnxkxhzkrnyoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527781.4326732-283-140212432813937/AnsiballZ_dnf.py'
Feb 19 19:03:01 compute-0 sudo[56173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:01 compute-0 sshd-session[55946]: Received disconnect from 138.255.157.62 port 61246:11: Bye Bye [preauth]
Feb 19 19:03:01 compute-0 sshd-session[55946]: Disconnected from invalid user ubuntu 138.255.157.62 port 61246 [preauth]
Feb 19 19:03:01 compute-0 python3.9[56176]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 19 19:03:03 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 19 19:03:03 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 19 19:03:03 compute-0 systemd[1]: Reloading.
Feb 19 19:03:03 compute-0 systemd-rc-local-generator[56215]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:03:03 compute-0 systemd-sysv-generator[56218]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:03:03 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 19 19:03:03 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 19 19:03:03 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 19 19:03:03 compute-0 systemd[1]: run-r189daa6a02ae47788e7f5b75f3b3d712.service: Deactivated successfully.
Feb 19 19:03:03 compute-0 sudo[56173]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:04 compute-0 sudo[56498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvsnechychoxeevhjwktxfgbompnhgbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527784.5693712-299-80431965251257/AnsiballZ_systemd.py'
Feb 19 19:03:04 compute-0 sudo[56498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:05 compute-0 python3.9[56501]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 19 19:03:05 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 19 19:03:05 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Feb 19 19:03:05 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Feb 19 19:03:05 compute-0 systemd[1]: Stopping Network Manager...
Feb 19 19:03:05 compute-0 NetworkManager[7684]: <info>  [1771527785.1267] caught SIGTERM, shutting down normally.
Feb 19 19:03:05 compute-0 NetworkManager[7684]: <info>  [1771527785.1278] dhcp4 (eth0): canceled DHCP transaction
Feb 19 19:03:05 compute-0 NetworkManager[7684]: <info>  [1771527785.1278] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 19 19:03:05 compute-0 NetworkManager[7684]: <info>  [1771527785.1278] dhcp4 (eth0): state changed no lease
Feb 19 19:03:05 compute-0 NetworkManager[7684]: <info>  [1771527785.1281] manager: NetworkManager state is now CONNECTED_SITE
Feb 19 19:03:05 compute-0 NetworkManager[7684]: <info>  [1771527785.1454] exiting (success)
Feb 19 19:03:05 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 19 19:03:05 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 19 19:03:05 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 19 19:03:05 compute-0 systemd[1]: Stopped Network Manager.
Feb 19 19:03:05 compute-0 systemd[1]: NetworkManager.service: Consumed 10.819s CPU time, 4.1M memory peak, read 0B from disk, written 16.0K to disk.
Feb 19 19:03:05 compute-0 systemd[1]: Starting Network Manager...
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.1884] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:ff499d4a-6be1-401f-8b91-3dc243d525b1)
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.1885] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.1920] manager[0x55f70a602000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 19 19:03:05 compute-0 systemd[1]: Starting Hostname Service...
Feb 19 19:03:05 compute-0 systemd[1]: Started Hostname Service.
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2494] hostname: hostname: using hostnamed
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2494] hostname: static hostname changed from (none) to "compute-0"
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2501] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2506] manager[0x55f70a602000]: rfkill: Wi-Fi hardware radio set enabled
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2506] manager[0x55f70a602000]: rfkill: WWAN hardware radio set enabled
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2528] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2538] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2538] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2539] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2540] manager: Networking is enabled by state file
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2543] settings: Loaded settings plugin: keyfile (internal)
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2546] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2577] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2586] dhcp: init: Using DHCP client 'internal'
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2589] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2595] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2601] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2609] device (lo): Activation: starting connection 'lo' (4563c9fb-fd0a-4471-af90-84b9f0d73c08)
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2616] device (eth0): carrier: link connected
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2620] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2627] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2628] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2635] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2642] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2648] device (eth1): carrier: link connected
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2651] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2658] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (4e789d3f-d1df-5b4d-a218-9665bd6d0d0e) (indicated)
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2658] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2664] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2671] device (eth1): Activation: starting connection 'ci-private-network' (4e789d3f-d1df-5b4d-a218-9665bd6d0d0e)
Feb 19 19:03:05 compute-0 systemd[1]: Started Network Manager.
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2677] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2684] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2686] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2689] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2692] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2696] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2699] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2703] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2707] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2714] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2718] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2727] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2740] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2749] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2752] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2757] device (lo): Activation: successful, device activated.
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2766] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2769] dhcp4 (eth0): state changed new lease, address=38.102.83.219
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2771] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2774] manager: NetworkManager state is now CONNECTED_LOCAL
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2777] device (eth1): Activation: successful, device activated.
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2789] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 19 19:03:05 compute-0 systemd[1]: Starting Network Manager Wait Online...
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2849] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2866] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2868] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2872] manager: NetworkManager state is now CONNECTED_SITE
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2876] device (eth0): Activation: successful, device activated.
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2881] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 19 19:03:05 compute-0 NetworkManager[56519]: <info>  [1771527785.2883] manager: startup complete
Feb 19 19:03:05 compute-0 sudo[56498]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:05 compute-0 systemd[1]: Finished Network Manager Wait Online.
Feb 19 19:03:05 compute-0 sudo[56725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qawhktlysduwnuhetwnafajbswfsgsnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527785.6962678-315-92756056246834/AnsiballZ_dnf.py'
Feb 19 19:03:05 compute-0 sudo[56725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:06 compute-0 python3.9[56728]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 19 19:03:11 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 19 19:03:11 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 19 19:03:11 compute-0 systemd[1]: Reloading.
Feb 19 19:03:11 compute-0 systemd-rc-local-generator[56776]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:03:11 compute-0 systemd-sysv-generator[56779]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:03:11 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 19 19:03:12 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 19 19:03:12 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 19 19:03:12 compute-0 systemd[1]: run-r98c4c4cb47ea4365919b36d30efd3f18.service: Deactivated successfully.
Feb 19 19:03:12 compute-0 sudo[56725]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:13 compute-0 sudo[57204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjpwnrgbybydyuazdynakwdrtdcgublg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527792.8570468-339-69554866879026/AnsiballZ_stat.py'
Feb 19 19:03:13 compute-0 sudo[57204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:13 compute-0 python3.9[57207]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:03:13 compute-0 sudo[57204]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:13 compute-0 sudo[57357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gofahinjcoegvotjbbbsfqkmdsmvfzkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527793.4454129-357-35341982013209/AnsiballZ_ini_file.py'
Feb 19 19:03:13 compute-0 sudo[57357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:13 compute-0 python3.9[57360]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:03:13 compute-0 sudo[57357]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:14 compute-0 sudo[57512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndlzngicuygctucligwoggzrsbyyapzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527794.3039575-377-172960074152862/AnsiballZ_ini_file.py'
Feb 19 19:03:14 compute-0 sudo[57512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:14 compute-0 python3.9[57515]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:03:14 compute-0 sudo[57512]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:14 compute-0 sudo[57665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyacxarmfcmfljxqgjxsmjmkpuvvwske ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527794.7712631-377-55956365736900/AnsiballZ_ini_file.py'
Feb 19 19:03:14 compute-0 sudo[57665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:15 compute-0 python3.9[57668]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:03:15 compute-0 sudo[57665]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:15 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 19 19:03:15 compute-0 sudo[57818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxamiavjwesyrpdxstfdvulcvnimunnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527795.5100691-407-173866740686079/AnsiballZ_ini_file.py'
Feb 19 19:03:15 compute-0 sudo[57818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:15 compute-0 python3.9[57821]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:03:15 compute-0 sudo[57818]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:16 compute-0 sudo[57971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mppfggvoqlztduouzggfcviiaifyrhyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527796.0301497-407-252153247160425/AnsiballZ_ini_file.py'
Feb 19 19:03:16 compute-0 sudo[57971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:16 compute-0 python3.9[57974]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:03:16 compute-0 sudo[57971]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:16 compute-0 sudo[58124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exwaplkudgxpsdibojdrqcdxmmpldugr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527796.7422261-437-36133158220838/AnsiballZ_stat.py'
Feb 19 19:03:16 compute-0 sudo[58124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:17 compute-0 python3.9[58127]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:03:17 compute-0 sudo[58124]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:17 compute-0 sudo[58248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkjfksnnuvcthnjqklxdlfmciasxdvfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527796.7422261-437-36133158220838/AnsiballZ_copy.py'
Feb 19 19:03:17 compute-0 sudo[58248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:17 compute-0 python3.9[58251]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1771527796.7422261-437-36133158220838/.source _original_basename=.zumh7y9x follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:03:17 compute-0 sudo[58248]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:18 compute-0 sudo[58401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jskshasjxsowcnogierhrvvsimgndhym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527797.8719053-467-119752314399479/AnsiballZ_file.py'
Feb 19 19:03:18 compute-0 sudo[58401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:18 compute-0 python3.9[58404]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:03:18 compute-0 sudo[58401]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:18 compute-0 sudo[58554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjenowtzxawsqupckjpjrersfhurxwoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527798.4612038-483-244902158695538/AnsiballZ_edpm_os_net_config_mappings.py'
Feb 19 19:03:18 compute-0 sudo[58554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:18 compute-0 python3.9[58557]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Feb 19 19:03:19 compute-0 sudo[58554]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:19 compute-0 sudo[58707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajqhkwhicyfrdxjpkbnbwcduwuxqatgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527799.218599-501-193867035109509/AnsiballZ_file.py'
Feb 19 19:03:19 compute-0 sudo[58707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:19 compute-0 python3.9[58710]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:03:19 compute-0 sudo[58707]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:20 compute-0 sudo[58860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yljudnbiwqzqbceisnhkuxassjyopsqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527799.9398205-521-112192595153411/AnsiballZ_stat.py'
Feb 19 19:03:20 compute-0 sudo[58860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:20 compute-0 sudo[58860]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:20 compute-0 sudo[58984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvcdvgwlqnhwikfrxbfpyffrqxrpktym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527799.9398205-521-112192595153411/AnsiballZ_copy.py'
Feb 19 19:03:20 compute-0 sudo[58984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:20 compute-0 sudo[58984]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:21 compute-0 sudo[59137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koiesshqhrfhmtlozlybgnfkjhlrteix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527801.0358484-551-155173178550133/AnsiballZ_slurp.py'
Feb 19 19:03:21 compute-0 sudo[59137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:21 compute-0 python3.9[59140]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Feb 19 19:03:21 compute-0 sudo[59137]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:22 compute-0 sudo[59313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrvtpafmoxlcrljwksgxgfbbhnudvmyh ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527801.8251774-569-17613333578896/async_wrapper.py j858541784365 300 /home/zuul/.ansible/tmp/ansible-tmp-1771527801.8251774-569-17613333578896/AnsiballZ_edpm_os_net_config.py _'
Feb 19 19:03:22 compute-0 sudo[59313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:22 compute-0 ansible-async_wrapper.py[59316]: Invoked with j858541784365 300 /home/zuul/.ansible/tmp/ansible-tmp-1771527801.8251774-569-17613333578896/AnsiballZ_edpm_os_net_config.py _
Feb 19 19:03:22 compute-0 ansible-async_wrapper.py[59319]: Starting module and watcher
Feb 19 19:03:22 compute-0 ansible-async_wrapper.py[59319]: Start watching 59320 (300)
Feb 19 19:03:22 compute-0 ansible-async_wrapper.py[59320]: Start module (59320)
Feb 19 19:03:22 compute-0 ansible-async_wrapper.py[59316]: Return async_wrapper task started.
Feb 19 19:03:22 compute-0 sudo[59313]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:22 compute-0 python3.9[59321]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True remove_config=False safe_defaults=False use_nmstate=True purge_provider=
Feb 19 19:03:23 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Feb 19 19:03:23 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Feb 19 19:03:23 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Feb 19 19:03:23 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Feb 19 19:03:23 compute-0 kernel: cfg80211: failed to load regulatory.db
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.1807] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59322 uid=0 result="success"
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.1825] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59322 uid=0 result="success"
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2263] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2264] audit: op="connection-add" uuid="813f3193-5bd3-446e-9a79-ad99602bfe3e" name="br-ex-br" pid=59322 uid=0 result="success"
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2278] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2279] audit: op="connection-add" uuid="9944028b-dde2-4568-a823-2dd0a7d04aa6" name="br-ex-port" pid=59322 uid=0 result="success"
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2289] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2290] audit: op="connection-add" uuid="3c2f9125-4832-4904-a354-0fc19ff2d9c7" name="eth1-port" pid=59322 uid=0 result="success"
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2299] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2300] audit: op="connection-add" uuid="f058da4e-c8a8-4814-949b-db660ad6ec2b" name="vlan20-port" pid=59322 uid=0 result="success"
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2310] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2311] audit: op="connection-add" uuid="62951437-8441-4557-a4d2-e6152874cb60" name="vlan21-port" pid=59322 uid=0 result="success"
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2320] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2321] audit: op="connection-add" uuid="5af3c8c7-6d80-43bb-a9e1-1350b0d2ee49" name="vlan22-port" pid=59322 uid=0 result="success"
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2337] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.timestamp,connection.autoconnect-priority,802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode" pid=59322 uid=0 result="success"
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2351] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2352] audit: op="connection-add" uuid="0c4b76a2-ee0f-4d87-967c-e4235f988ee9" name="br-ex-if" pid=59322 uid=0 result="success"
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2409] audit: op="connection-update" uuid="4e789d3f-d1df-5b4d-a218-9665bd6d0d0e" name="ci-private-network" args="connection.master,connection.timestamp,connection.port-type,connection.controller,connection.slave-type,ipv4.routing-rules,ipv4.method,ipv4.addresses,ipv4.never-default,ipv4.dns,ipv4.routes,ovs-external-ids.data,ipv6.routing-rules,ipv6.method,ipv6.addr-gen-mode,ipv6.addresses,ipv6.dns,ipv6.routes,ovs-interface.type" pid=59322 uid=0 result="success"
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2422] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2424] audit: op="connection-add" uuid="5ac354c7-e203-480f-a383-dc5c8646884c" name="vlan20-if" pid=59322 uid=0 result="success"
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2436] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2438] audit: op="connection-add" uuid="001b03ab-d42f-471f-8fc5-f4532f9defdd" name="vlan21-if" pid=59322 uid=0 result="success"
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2451] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2452] audit: op="connection-add" uuid="82443504-e179-437f-83b1-e13ad0596574" name="vlan22-if" pid=59322 uid=0 result="success"
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2462] audit: op="connection-delete" uuid="1cf3a24b-f7c9-3493-9032-4c304d6d9dc5" name="Wired connection 1" pid=59322 uid=0 result="success"
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2471] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <warn>  [1771527804.2473] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2478] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2482] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (813f3193-5bd3-446e-9a79-ad99602bfe3e)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2482] audit: op="connection-activate" uuid="813f3193-5bd3-446e-9a79-ad99602bfe3e" name="br-ex-br" pid=59322 uid=0 result="success"
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2484] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <warn>  [1771527804.2484] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2490] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2493] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (9944028b-dde2-4568-a823-2dd0a7d04aa6)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2494] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <warn>  [1771527804.2495] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2499] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2503] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (3c2f9125-4832-4904-a354-0fc19ff2d9c7)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2504] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <warn>  [1771527804.2505] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2510] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2513] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (f058da4e-c8a8-4814-949b-db660ad6ec2b)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2514] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <warn>  [1771527804.2515] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2520] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2523] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (62951437-8441-4557-a4d2-e6152874cb60)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2525] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <warn>  [1771527804.2526] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2530] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2534] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (5af3c8c7-6d80-43bb-a9e1-1350b0d2ee49)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2534] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2537] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2538] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2543] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <warn>  [1771527804.2544] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2547] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2551] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (0c4b76a2-ee0f-4d87-967c-e4235f988ee9)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2551] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2554] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2556] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2557] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2558] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2567] device (eth1): disconnecting for new activation request.
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2567] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2570] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2572] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2573] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2575] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <warn>  [1771527804.2576] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2579] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2583] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (5ac354c7-e203-480f-a383-dc5c8646884c)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2583] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2586] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2588] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2589] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2591] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <warn>  [1771527804.2592] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2595] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2599] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (001b03ab-d42f-471f-8fc5-f4532f9defdd)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2599] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2602] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2603] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2605] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2608] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <warn>  [1771527804.2609] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2612] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2615] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (82443504-e179-437f-83b1-e13ad0596574)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2616] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2619] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2620] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2622] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2623] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2634] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.method,ipv6.addr-gen-mode" pid=59322 uid=0 result="success"
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2636] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2639] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2640] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2645] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2649] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2652] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2655] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2657] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2662] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2666] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2669] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 kernel: ovs-system: entered promiscuous mode
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2671] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2675] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2679] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 systemd-udevd[59326]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:03:24 compute-0 kernel: Timeout policy base is empty
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2683] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2685] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2691] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2695] dhcp4 (eth0): canceled DHCP transaction
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2695] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2695] dhcp4 (eth0): state changed no lease
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2697] dhcp4 (eth0): activation: beginning transaction (no timeout)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2704] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2708] audit: op="device-reapply" interface="eth1" ifindex=3 pid=59322 uid=0 result="fail" reason="Device is not activated"
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2711] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Feb 19 19:03:24 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2742] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2746] dhcp4 (eth0): state changed new lease, address=38.102.83.219
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2751] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Feb 19 19:03:24 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2798] device (eth1): disconnecting for new activation request.
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2799] audit: op="connection-activate" uuid="4e789d3f-d1df-5b4d-a218-9665bd6d0d0e" name="ci-private-network" pid=59322 uid=0 result="success"
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2804] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2883] device (eth1): Activation: starting connection 'ci-private-network' (4e789d3f-d1df-5b4d-a218-9665bd6d0d0e)
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2900] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2905] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2910] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2912] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59322 uid=0 result="success"
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2913] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2914] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2918] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2919] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2921] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2924] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2930] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2933] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2937] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2941] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2945] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2949] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2953] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2956] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2960] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2963] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2967] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2970] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2975] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.2979] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 kernel: br-ex: entered promiscuous mode
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3055] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3057] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3061] device (eth1): Activation: successful, device activated.
Feb 19 19:03:24 compute-0 kernel: vlan22: entered promiscuous mode
Feb 19 19:03:24 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Feb 19 19:03:24 compute-0 systemd-udevd[59327]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:03:24 compute-0 kernel: vlan20: entered promiscuous mode
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3137] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3152] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3163] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 kernel: vlan21: entered promiscuous mode
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3180] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3194] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3213] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3223] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3228] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3233] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3238] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3242] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3247] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3255] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3259] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3263] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3271] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3289] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3329] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3331] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 19 19:03:24 compute-0 NetworkManager[56519]: <info>  [1771527804.3335] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 19 19:03:25 compute-0 NetworkManager[56519]: <info>  [1771527805.4449] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59322 uid=0 result="success"
Feb 19 19:03:25 compute-0 NetworkManager[56519]: <info>  [1771527805.5702] checkpoint[0x55f70a5d8950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Feb 19 19:03:25 compute-0 NetworkManager[56519]: <info>  [1771527805.5703] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59322 uid=0 result="success"
Feb 19 19:03:25 compute-0 NetworkManager[56519]: <info>  [1771527805.8229] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59322 uid=0 result="success"
Feb 19 19:03:25 compute-0 NetworkManager[56519]: <info>  [1771527805.8239] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59322 uid=0 result="success"
Feb 19 19:03:25 compute-0 NetworkManager[56519]: <info>  [1771527805.9855] audit: op="networking-control" arg="global-dns-configuration" pid=59322 uid=0 result="success"
Feb 19 19:03:25 compute-0 NetworkManager[56519]: <info>  [1771527805.9875] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Feb 19 19:03:25 compute-0 NetworkManager[56519]: <info>  [1771527805.9905] audit: op="networking-control" arg="global-dns-configuration" pid=59322 uid=0 result="success"
Feb 19 19:03:25 compute-0 NetworkManager[56519]: <info>  [1771527805.9920] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59322 uid=0 result="success"
Feb 19 19:03:26 compute-0 sudo[59658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gieebdktblknpmmeyzjgvrcazyadiosm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527805.7198844-569-199293145226406/AnsiballZ_async_status.py'
Feb 19 19:03:26 compute-0 sudo[59658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:26 compute-0 NetworkManager[56519]: <info>  [1771527806.1115] checkpoint[0x55f70a5d8a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Feb 19 19:03:26 compute-0 NetworkManager[56519]: <info>  [1771527806.1120] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59322 uid=0 result="success"
Feb 19 19:03:26 compute-0 ansible-async_wrapper.py[59320]: Module complete (59320)
Feb 19 19:03:26 compute-0 python3.9[59661]: ansible-ansible.legacy.async_status Invoked with jid=j858541784365.59316 mode=status _async_dir=/root/.ansible_async
Feb 19 19:03:26 compute-0 sudo[59658]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:26 compute-0 sudo[59758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbvlnphhvmvhplchhthrezgysahyzfde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527805.7198844-569-199293145226406/AnsiballZ_async_status.py'
Feb 19 19:03:26 compute-0 sudo[59758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:26 compute-0 python3.9[59761]: ansible-ansible.legacy.async_status Invoked with jid=j858541784365.59316 mode=cleanup _async_dir=/root/.ansible_async
Feb 19 19:03:26 compute-0 sudo[59758]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:27 compute-0 ansible-async_wrapper.py[59319]: Done in kid B.
Feb 19 19:03:30 compute-0 sudo[59913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iluvfwnzxbynrkzafufypsdrtrdvrtog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527810.4874547-618-231137051198040/AnsiballZ_stat.py'
Feb 19 19:03:30 compute-0 sudo[59913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:30 compute-0 python3.9[59916]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:03:30 compute-0 sudo[59913]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:31 compute-0 sudo[60037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axwmtanfyhdgyytctqcolfhepjudvzbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527810.4874547-618-231137051198040/AnsiballZ_copy.py'
Feb 19 19:03:31 compute-0 sudo[60037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:31 compute-0 python3.9[60040]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771527810.4874547-618-231137051198040/.source.returncode _original_basename=.ld2fzgap follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:03:31 compute-0 sudo[60037]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:31 compute-0 sudo[60190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utwojhpbtqdafsdhxkuvloxngzzdjfrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527811.5997105-650-250921401573250/AnsiballZ_stat.py'
Feb 19 19:03:31 compute-0 sudo[60190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:31 compute-0 python3.9[60193]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:03:31 compute-0 sudo[60190]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:32 compute-0 sudo[60314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xahakkanvnvxuknpukiaqfpqrrdqardg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527811.5997105-650-250921401573250/AnsiballZ_copy.py'
Feb 19 19:03:32 compute-0 sudo[60314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:32 compute-0 python3.9[60317]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771527811.5997105-650-250921401573250/.source.cfg _original_basename=.1phajn00 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:03:32 compute-0 sudo[60314]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:32 compute-0 sudo[60467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toklucwsrezygzfaoikonijixlhciszs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527812.722118-680-108840975145103/AnsiballZ_systemd.py'
Feb 19 19:03:32 compute-0 sudo[60467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:33 compute-0 python3.9[60470]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 19 19:03:33 compute-0 systemd[1]: Reloading Network Manager...
Feb 19 19:03:33 compute-0 NetworkManager[56519]: <info>  [1771527813.2474] audit: op="reload" arg="0" pid=60475 uid=0 result="success"
Feb 19 19:03:33 compute-0 NetworkManager[56519]: <info>  [1771527813.2481] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Feb 19 19:03:33 compute-0 systemd[1]: Reloaded Network Manager.
Feb 19 19:03:33 compute-0 sudo[60467]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:33 compute-0 sshd-session[52463]: Connection closed by 192.168.122.30 port 58732
Feb 19 19:03:33 compute-0 sshd-session[52460]: pam_unix(sshd:session): session closed for user zuul
Feb 19 19:03:33 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Feb 19 19:03:33 compute-0 systemd[1]: session-12.scope: Consumed 41.121s CPU time.
Feb 19 19:03:33 compute-0 systemd-logind[822]: Session 12 logged out. Waiting for processes to exit.
Feb 19 19:03:33 compute-0 systemd-logind[822]: Removed session 12.
Feb 19 19:03:35 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 19 19:03:39 compute-0 sshd-session[60508]: Accepted publickey for zuul from 192.168.122.30 port 52560 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 19:03:39 compute-0 systemd-logind[822]: New session 13 of user zuul.
Feb 19 19:03:39 compute-0 systemd[1]: Started Session 13 of User zuul.
Feb 19 19:03:39 compute-0 sshd-session[60508]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 19:03:40 compute-0 python3.9[60662]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:03:40 compute-0 python3.9[60816]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 19 19:03:41 compute-0 python3.9[61005]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:03:42 compute-0 sshd-session[60511]: Connection closed by 192.168.122.30 port 52560
Feb 19 19:03:42 compute-0 sshd-session[60508]: pam_unix(sshd:session): session closed for user zuul
Feb 19 19:03:42 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Feb 19 19:03:42 compute-0 systemd[1]: session-13.scope: Consumed 1.746s CPU time.
Feb 19 19:03:42 compute-0 systemd-logind[822]: Session 13 logged out. Waiting for processes to exit.
Feb 19 19:03:42 compute-0 systemd-logind[822]: Removed session 13.
Feb 19 19:03:43 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 19 19:03:47 compute-0 sshd-session[61035]: Accepted publickey for zuul from 192.168.122.30 port 36394 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 19:03:47 compute-0 systemd-logind[822]: New session 14 of user zuul.
Feb 19 19:03:47 compute-0 systemd[1]: Started Session 14 of User zuul.
Feb 19 19:03:47 compute-0 sshd-session[61035]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 19:03:48 compute-0 python3.9[61188]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:03:48 compute-0 python3.9[61342]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:03:49 compute-0 sudo[61497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cquvjpkxkwwtzqiubtzrpcbssbemrgio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527829.2894108-55-91120416708813/AnsiballZ_setup.py'
Feb 19 19:03:49 compute-0 sudo[61497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:49 compute-0 python3.9[61500]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 19 19:03:50 compute-0 sudo[61497]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:50 compute-0 sudo[61582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ioptgoumkzuewqbiioncqrgdlkgzpise ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527829.2894108-55-91120416708813/AnsiballZ_dnf.py'
Feb 19 19:03:50 compute-0 sudo[61582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:50 compute-0 python3.9[61585]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 19 19:03:51 compute-0 sudo[61582]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:52 compute-0 sudo[61736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvxyliguiczjtocqhjraylbwsdxkdmou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527831.9391766-79-120461896541566/AnsiballZ_setup.py'
Feb 19 19:03:52 compute-0 sudo[61736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:52 compute-0 python3.9[61739]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 19 19:03:52 compute-0 sudo[61736]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:53 compute-0 sudo[61929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djnhlpsugcciebiqinyskpcoxxmoryln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527832.912995-101-114996657605073/AnsiballZ_file.py'
Feb 19 19:03:53 compute-0 sudo[61929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:53 compute-0 python3.9[61932]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:03:53 compute-0 sudo[61929]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:54 compute-0 sudo[62082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykgaxylbnvwsmqvqteeivyxsmdbinevc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527833.6700757-117-61849345857863/AnsiballZ_command.py'
Feb 19 19:03:54 compute-0 sudo[62082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:54 compute-0 python3.9[62085]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:03:54 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:03:54 compute-0 sudo[62082]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:54 compute-0 sudo[62246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rybyflgdrquxkwcmraynaovjyhxgfuzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527834.4583125-133-74802413306464/AnsiballZ_stat.py'
Feb 19 19:03:54 compute-0 sudo[62246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:54 compute-0 python3.9[62249]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:03:54 compute-0 sudo[62246]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:55 compute-0 sudo[62325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucbzkcyldkvfupmcddomhhkgqqjjjavn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527834.4583125-133-74802413306464/AnsiballZ_file.py'
Feb 19 19:03:55 compute-0 sudo[62325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:55 compute-0 python3.9[62328]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:03:55 compute-0 sudo[62325]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:55 compute-0 sudo[62479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkulrbwsbnynxyautdxqtjtfgtztyaoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527835.5264497-157-154842719905892/AnsiballZ_stat.py'
Feb 19 19:03:55 compute-0 sudo[62479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:55 compute-0 python3.9[62482]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:03:56 compute-0 sudo[62479]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:56 compute-0 sudo[62558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvlwtwfrnnyhbbrttovglhcqgpioytei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527835.5264497-157-154842719905892/AnsiballZ_file.py'
Feb 19 19:03:56 compute-0 sudo[62558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:56 compute-0 python3.9[62561]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:03:56 compute-0 sudo[62558]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:56 compute-0 sudo[62711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvvjdjiycjveplerwiveiuazcovqqiwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527836.5377126-183-254362662566203/AnsiballZ_ini_file.py'
Feb 19 19:03:56 compute-0 sudo[62711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:57 compute-0 python3.9[62714]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:03:57 compute-0 sudo[62711]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:57 compute-0 sudo[62864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ioyzbzypywjmetoefhhpmxdictwwmaab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527837.2160952-183-206912247977416/AnsiballZ_ini_file.py'
Feb 19 19:03:57 compute-0 sudo[62864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:57 compute-0 python3.9[62867]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:03:57 compute-0 sudo[62864]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:57 compute-0 sudo[63017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxavwodiwxdzfnpjcyeacumzsrzwmpwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527837.7235358-183-20243078162259/AnsiballZ_ini_file.py'
Feb 19 19:03:57 compute-0 sudo[63017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:58 compute-0 python3.9[63020]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:03:58 compute-0 sudo[63017]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:58 compute-0 sudo[63170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsjfoustcliqrwgczphkilororhtkqdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527838.2049534-183-131484326843326/AnsiballZ_ini_file.py'
Feb 19 19:03:58 compute-0 sudo[63170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:58 compute-0 python3.9[63173]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:03:58 compute-0 sudo[63170]: pam_unix(sudo:session): session closed for user root
Feb 19 19:03:59 compute-0 sudo[63323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbygwtqdvkvabelclqzetrehnlkhiftb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527839.289238-245-224111501145923/AnsiballZ_dnf.py'
Feb 19 19:03:59 compute-0 sudo[63323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:03:59 compute-0 python3.9[63326]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 19 19:04:00 compute-0 sudo[63323]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:01 compute-0 sudo[63477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpugsdhnkzrnjpilcgvrzcnhwlbuydji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527841.567268-267-33226733024629/AnsiballZ_setup.py'
Feb 19 19:04:01 compute-0 sudo[63477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:02 compute-0 python3.9[63480]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:04:02 compute-0 sudo[63477]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:02 compute-0 sudo[63632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaloanrsqjbbizqpulxivwsqjnpexokl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527842.33696-283-88721230359372/AnsiballZ_stat.py'
Feb 19 19:04:02 compute-0 sudo[63632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:02 compute-0 python3.9[63635]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:04:02 compute-0 sudo[63632]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:03 compute-0 sudo[63785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otrqrnscyyhpahntfazghmawlwvtpsgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527842.957726-301-133408286294437/AnsiballZ_stat.py'
Feb 19 19:04:03 compute-0 sudo[63785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:03 compute-0 python3.9[63788]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:04:03 compute-0 sudo[63785]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:03 compute-0 sudo[63938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esfoqgvmzjubtwbkiafatggmufcsqvuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527843.7319105-321-36610886718867/AnsiballZ_command.py'
Feb 19 19:04:03 compute-0 sudo[63938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:04 compute-0 python3.9[63941]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:04:04 compute-0 sudo[63938]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:04 compute-0 sudo[64092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzkxqykdswkugibqiteyelhbdyrmzfnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527844.4793756-341-164765753609072/AnsiballZ_service_facts.py'
Feb 19 19:04:04 compute-0 sudo[64092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:04 compute-0 python3.9[64095]: ansible-service_facts Invoked
Feb 19 19:04:05 compute-0 network[64112]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 19 19:04:05 compute-0 network[64113]: 'network-scripts' will be removed from distribution in near future.
Feb 19 19:04:05 compute-0 network[64114]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 19 19:04:07 compute-0 sudo[64092]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:09 compute-0 sudo[64398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsiiutzkgdjxaaeniphmlxixesrlppdk ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1771527849.1008017-371-94620214520384/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1771527849.1008017-371-94620214520384/args'
Feb 19 19:04:09 compute-0 sudo[64398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:09 compute-0 sudo[64398]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:09 compute-0 sshd-session[64438]: Received disconnect from 43.166.137.151 port 42822:11: Bye Bye [preauth]
Feb 19 19:04:09 compute-0 sshd-session[64438]: Disconnected from authenticating user root 43.166.137.151 port 42822 [preauth]
Feb 19 19:04:09 compute-0 sudo[64568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kufhfvgucudtlwocyezgyuqruucxelso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527849.707998-393-69204554647247/AnsiballZ_dnf.py'
Feb 19 19:04:09 compute-0 sudo[64568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:10 compute-0 python3.9[64571]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 19 19:04:11 compute-0 sudo[64568]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:12 compute-0 sudo[64722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnfmqvytqztuubekdskwowmabqymwvri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527851.7687047-419-90436309742010/AnsiballZ_package_facts.py'
Feb 19 19:04:12 compute-0 sudo[64722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:12 compute-0 python3.9[64725]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Feb 19 19:04:12 compute-0 sudo[64722]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:13 compute-0 sudo[64875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqbaskrfkyiisigfxldxkgqtrnvdehxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527853.439597-439-40308132954812/AnsiballZ_stat.py'
Feb 19 19:04:13 compute-0 sudo[64875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:13 compute-0 python3.9[64878]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:04:13 compute-0 sudo[64875]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:14 compute-0 sudo[65001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htyvnwmhraoivvvwzfmmhayryloeqcpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527853.439597-439-40308132954812/AnsiballZ_copy.py'
Feb 19 19:04:14 compute-0 sudo[65001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:14 compute-0 python3.9[65004]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771527853.439597-439-40308132954812/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:04:14 compute-0 sudo[65001]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:15 compute-0 sudo[65156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiagqdbmjcgpxfccloporniriqwqucqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527854.738769-469-187992312882731/AnsiballZ_stat.py'
Feb 19 19:04:15 compute-0 sudo[65156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:15 compute-0 python3.9[65159]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:04:15 compute-0 sudo[65156]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:15 compute-0 sudo[65282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmelxuhxcsvqicpvobmmnaeihradhdki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527854.738769-469-187992312882731/AnsiballZ_copy.py'
Feb 19 19:04:15 compute-0 sudo[65282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:15 compute-0 python3.9[65285]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771527854.738769-469-187992312882731/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:04:15 compute-0 sudo[65282]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:16 compute-0 sudo[65437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fybzeykkgkeuizbvmditbgnbnchvnhjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527856.5023007-511-25668487269660/AnsiballZ_lineinfile.py'
Feb 19 19:04:16 compute-0 sudo[65437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:17 compute-0 python3.9[65440]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:04:17 compute-0 sudo[65437]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:18 compute-0 sudo[65592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeotipusiffkxbwczccpbiddfzhamyyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527857.7922585-541-247525412231325/AnsiballZ_setup.py'
Feb 19 19:04:18 compute-0 sudo[65592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:18 compute-0 python3.9[65595]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 19 19:04:18 compute-0 sudo[65592]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:18 compute-0 sudo[65677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svyqnakxljdpysvhgbuehwpvrxfmplrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527857.7922585-541-247525412231325/AnsiballZ_systemd.py'
Feb 19 19:04:18 compute-0 sudo[65677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:19 compute-0 python3.9[65680]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:04:19 compute-0 sudo[65677]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:20 compute-0 sudo[65832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdptlwoaulldodkersvivhgullyhfbbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527859.8186061-573-35624926412458/AnsiballZ_setup.py'
Feb 19 19:04:20 compute-0 sudo[65832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:20 compute-0 python3.9[65835]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 19 19:04:20 compute-0 sudo[65832]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:20 compute-0 sudo[65917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inuqtbbatvvhzdvfxrejmghrbpfxtvtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527859.8186061-573-35624926412458/AnsiballZ_systemd.py'
Feb 19 19:04:20 compute-0 sudo[65917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:21 compute-0 python3.9[65920]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 19 19:04:21 compute-0 chronyd[830]: chronyd exiting
Feb 19 19:04:21 compute-0 systemd[1]: Stopping NTP client/server...
Feb 19 19:04:21 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Feb 19 19:04:21 compute-0 systemd[1]: Stopped NTP client/server.
Feb 19 19:04:21 compute-0 systemd[1]: Starting NTP client/server...
Feb 19 19:04:21 compute-0 chronyd[65928]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Feb 19 19:04:21 compute-0 chronyd[65928]: Frequency -23.259 +/- 0.220 ppm read from /var/lib/chrony/drift
Feb 19 19:04:21 compute-0 chronyd[65928]: Loaded seccomp filter (level 2)
Feb 19 19:04:21 compute-0 systemd[1]: Started NTP client/server.
Feb 19 19:04:21 compute-0 sudo[65917]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:21 compute-0 sshd-session[61038]: Connection closed by 192.168.122.30 port 36394
Feb 19 19:04:21 compute-0 sshd-session[61035]: pam_unix(sshd:session): session closed for user zuul
Feb 19 19:04:21 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Feb 19 19:04:21 compute-0 systemd[1]: session-14.scope: Consumed 21.441s CPU time.
Feb 19 19:04:21 compute-0 systemd-logind[822]: Session 14 logged out. Waiting for processes to exit.
Feb 19 19:04:21 compute-0 systemd-logind[822]: Removed session 14.
Feb 19 19:04:26 compute-0 sshd-session[65954]: Accepted publickey for zuul from 192.168.122.30 port 59654 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 19:04:26 compute-0 systemd-logind[822]: New session 15 of user zuul.
Feb 19 19:04:26 compute-0 systemd[1]: Started Session 15 of User zuul.
Feb 19 19:04:26 compute-0 sshd-session[65954]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 19:04:27 compute-0 python3.9[66107]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:04:28 compute-0 sudo[66261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikivykcywdylgbunklwebiqribhawksd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527868.1047082-41-152944061790517/AnsiballZ_file.py'
Feb 19 19:04:28 compute-0 sudo[66261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:28 compute-0 python3.9[66264]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:04:28 compute-0 sudo[66261]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:29 compute-0 sudo[66437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tioyjcapnaaqtoxdsukmxynejnmbyqno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527868.8972738-57-138194102394021/AnsiballZ_stat.py'
Feb 19 19:04:29 compute-0 sudo[66437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:29 compute-0 python3.9[66440]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:04:29 compute-0 sudo[66437]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:29 compute-0 sudo[66516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntbkovefmjlwdfgffcyqrdpdwtpwazjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527868.8972738-57-138194102394021/AnsiballZ_file.py'
Feb 19 19:04:29 compute-0 sudo[66516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:29 compute-0 python3.9[66519]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.tdv83ypg recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:04:29 compute-0 sudo[66516]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:30 compute-0 sudo[66669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqdgoobxgahftcrnkyoebruocbhxbrgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527870.2591317-97-88053827837730/AnsiballZ_stat.py'
Feb 19 19:04:30 compute-0 sudo[66669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:30 compute-0 python3.9[66672]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:04:30 compute-0 sudo[66669]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:31 compute-0 sudo[66793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vasobpcfblitqgxnsnaqlqgbllkmenxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527870.2591317-97-88053827837730/AnsiballZ_copy.py'
Feb 19 19:04:31 compute-0 sudo[66793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:31 compute-0 python3.9[66796]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771527870.2591317-97-88053827837730/.source _original_basename=.4aylwe_h follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:04:31 compute-0 sudo[66793]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:32 compute-0 sudo[66946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovzqwimztrvekghfixlqdtevdfcpagqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527872.1176596-129-236505915831629/AnsiballZ_file.py'
Feb 19 19:04:32 compute-0 sudo[66946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:32 compute-0 python3.9[66949]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:04:32 compute-0 sudo[66946]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:33 compute-0 sudo[67101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koftgpxpnczzfrlkoawjriojafsxwoxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527872.8662674-145-63119167143015/AnsiballZ_stat.py'
Feb 19 19:04:33 compute-0 sudo[67101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:33 compute-0 python3.9[67104]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:04:33 compute-0 sudo[67101]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:33 compute-0 sshd-session[66974]: Received disconnect from 45.148.10.147 port 17012:11:  [preauth]
Feb 19 19:04:33 compute-0 sshd-session[66974]: Disconnected from authenticating user root 45.148.10.147 port 17012 [preauth]
Feb 19 19:04:33 compute-0 sudo[67227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxljrynvixokbxnostoxvytklewycutg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527872.8662674-145-63119167143015/AnsiballZ_copy.py'
Feb 19 19:04:33 compute-0 sudo[67227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:33 compute-0 python3.9[67230]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771527872.8662674-145-63119167143015/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:04:33 compute-0 sudo[67227]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:34 compute-0 sudo[67380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgcfznmhaclhiacdfraqpcepeaulqnzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527873.9571133-145-171299300170852/AnsiballZ_stat.py'
Feb 19 19:04:34 compute-0 sudo[67380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:34 compute-0 python3.9[67383]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:04:34 compute-0 sudo[67380]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:34 compute-0 sudo[67504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guoitehqrsmvnxdpsapschzvgztjmbpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527873.9571133-145-171299300170852/AnsiballZ_copy.py'
Feb 19 19:04:34 compute-0 sudo[67504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:34 compute-0 sshd-session[67105]: Received disconnect from 182.75.216.74 port 6579:11: Bye Bye [preauth]
Feb 19 19:04:34 compute-0 sshd-session[67105]: Disconnected from authenticating user root 182.75.216.74 port 6579 [preauth]
Feb 19 19:04:34 compute-0 python3.9[67507]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771527873.9571133-145-171299300170852/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:04:34 compute-0 sudo[67504]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:35 compute-0 sudo[67657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dczowislavyqpdpuypmodbgadfokdhbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527875.1226535-203-62072535424103/AnsiballZ_file.py'
Feb 19 19:04:35 compute-0 sudo[67657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:35 compute-0 python3.9[67660]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:04:35 compute-0 sudo[67657]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:36 compute-0 sudo[67810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imnzqciobvqnjjqnbydigefvntptneje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527875.7970614-219-135722225275494/AnsiballZ_stat.py'
Feb 19 19:04:36 compute-0 sudo[67810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:36 compute-0 python3.9[67813]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:04:36 compute-0 sudo[67810]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:36 compute-0 sudo[67934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkrcihypvdscipgudvhaqygekqfnxwxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527875.7970614-219-135722225275494/AnsiballZ_copy.py'
Feb 19 19:04:36 compute-0 sudo[67934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:36 compute-0 python3.9[67937]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527875.7970614-219-135722225275494/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:04:36 compute-0 sudo[67934]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:37 compute-0 sudo[68087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scjngdhzpuzrbhlygztekzaciaroflec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527876.9791143-249-198633441551254/AnsiballZ_stat.py'
Feb 19 19:04:37 compute-0 sudo[68087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:37 compute-0 python3.9[68090]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:04:37 compute-0 sudo[68087]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:37 compute-0 sudo[68211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atisdjsxtgcsgeahmyufqbujuutsicpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527876.9791143-249-198633441551254/AnsiballZ_copy.py'
Feb 19 19:04:37 compute-0 sudo[68211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:38 compute-0 python3.9[68214]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527876.9791143-249-198633441551254/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:04:38 compute-0 sudo[68211]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:38 compute-0 sudo[68364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffdlxmvlmyjtmuwxhscezenephotwjka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527878.3203495-279-223027113123160/AnsiballZ_systemd.py'
Feb 19 19:04:38 compute-0 sudo[68364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:39 compute-0 python3.9[68367]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:04:39 compute-0 systemd[1]: Reloading.
Feb 19 19:04:39 compute-0 systemd-sysv-generator[68393]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:04:39 compute-0 systemd-rc-local-generator[68388]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:04:39 compute-0 systemd[1]: Reloading.
Feb 19 19:04:39 compute-0 systemd-rc-local-generator[68446]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:04:39 compute-0 systemd-sysv-generator[68451]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:04:39 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Feb 19 19:04:39 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Feb 19 19:04:39 compute-0 sudo[68364]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:40 compute-0 sudo[68607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdwsutqhiemyjkkwhxozciegfmowvyxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527879.8158195-295-238680114783737/AnsiballZ_stat.py'
Feb 19 19:04:40 compute-0 sudo[68607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:40 compute-0 python3.9[68610]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:04:40 compute-0 sudo[68607]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:40 compute-0 sudo[68731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eczdbogfrnvsoiquhfxsmqyodvkqieyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527879.8158195-295-238680114783737/AnsiballZ_copy.py'
Feb 19 19:04:40 compute-0 sudo[68731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:40 compute-0 python3.9[68734]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527879.8158195-295-238680114783737/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:04:40 compute-0 sudo[68731]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:41 compute-0 sudo[68884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpefrfreoxpqvuqajldoxvuthbuvhben ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527880.9623504-325-201758621830132/AnsiballZ_stat.py'
Feb 19 19:04:41 compute-0 sudo[68884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:41 compute-0 python3.9[68887]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:04:41 compute-0 sudo[68884]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:41 compute-0 sudo[69008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akbhysqayhzoywvzxzozavnqguhjspxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527880.9623504-325-201758621830132/AnsiballZ_copy.py'
Feb 19 19:04:41 compute-0 sudo[69008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:41 compute-0 python3.9[69011]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527880.9623504-325-201758621830132/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:04:41 compute-0 sudo[69008]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:42 compute-0 sudo[69161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwjcvyzzvbzgxxfzrdqwiqgevyzygilr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527882.1325831-355-205887185427717/AnsiballZ_systemd.py'
Feb 19 19:04:42 compute-0 sudo[69161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:42 compute-0 python3.9[69164]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:04:42 compute-0 systemd[1]: Reloading.
Feb 19 19:04:42 compute-0 systemd-rc-local-generator[69188]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:04:42 compute-0 systemd-sysv-generator[69194]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:04:42 compute-0 systemd[1]: Reloading.
Feb 19 19:04:42 compute-0 systemd-rc-local-generator[69236]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:04:42 compute-0 systemd-sysv-generator[69240]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:04:43 compute-0 systemd[1]: Starting Create netns directory...
Feb 19 19:04:43 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 19 19:04:43 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 19 19:04:43 compute-0 systemd[1]: Finished Create netns directory.
Feb 19 19:04:43 compute-0 sudo[69161]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:43 compute-0 python3.9[69404]: ansible-ansible.builtin.service_facts Invoked
Feb 19 19:04:43 compute-0 network[69421]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 19 19:04:43 compute-0 network[69422]: 'network-scripts' will be removed from distribution in near future.
Feb 19 19:04:43 compute-0 network[69423]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 19 19:04:47 compute-0 sudo[69684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgsfbsqgencrrxemmyfwkpcmjvpojbna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527886.963545-387-250158447872349/AnsiballZ_systemd.py'
Feb 19 19:04:47 compute-0 sudo[69684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:47 compute-0 python3.9[69687]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:04:48 compute-0 systemd[1]: Reloading.
Feb 19 19:04:48 compute-0 systemd-sysv-generator[69714]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:04:48 compute-0 systemd-rc-local-generator[69707]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:04:48 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Feb 19 19:04:49 compute-0 iptables.init[69734]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Feb 19 19:04:49 compute-0 iptables.init[69734]: iptables: Flushing firewall rules: [  OK  ]
Feb 19 19:04:49 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Feb 19 19:04:49 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Feb 19 19:04:49 compute-0 sudo[69684]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:49 compute-0 sudo[69928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adribhdfjvjjkckdvlmwuhmwablnzqbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527889.2761242-387-260430641452549/AnsiballZ_systemd.py'
Feb 19 19:04:49 compute-0 sudo[69928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:49 compute-0 python3.9[69931]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:04:49 compute-0 sudo[69928]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:50 compute-0 sudo[70083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juclspnitmfxvvmmnqcwkdqreolrpdbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527890.15974-419-134774560409760/AnsiballZ_systemd.py'
Feb 19 19:04:50 compute-0 sudo[70083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:50 compute-0 python3.9[70086]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:04:50 compute-0 systemd[1]: Reloading.
Feb 19 19:04:50 compute-0 systemd-rc-local-generator[70110]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:04:50 compute-0 systemd-sysv-generator[70114]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:04:51 compute-0 systemd[1]: Starting Netfilter Tables...
Feb 19 19:04:51 compute-0 systemd[1]: Finished Netfilter Tables.
Feb 19 19:04:51 compute-0 sudo[70083]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:51 compute-0 sudo[70283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdhxpblghiffgigmyikphvcldlbkkvxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527891.290127-435-60261506340595/AnsiballZ_command.py'
Feb 19 19:04:51 compute-0 sudo[70283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:51 compute-0 python3.9[70286]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:04:51 compute-0 sudo[70283]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:52 compute-0 sudo[70437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvvtdeqjvswplfpzpmdiimnsfuzzxogr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527892.3822937-463-247197477668992/AnsiballZ_stat.py'
Feb 19 19:04:52 compute-0 sudo[70437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:52 compute-0 python3.9[70440]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:04:52 compute-0 sudo[70437]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:53 compute-0 sudo[70563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqnzdfwrouofcewzxecvkcqmpxjovioj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527892.3822937-463-247197477668992/AnsiballZ_copy.py'
Feb 19 19:04:53 compute-0 sudo[70563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:53 compute-0 python3.9[70566]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771527892.3822937-463-247197477668992/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:04:53 compute-0 sudo[70563]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:53 compute-0 sudo[70717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-madmqjtfoemmcabkyscgziwynkdrqeys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527893.7007008-493-141785084721497/AnsiballZ_systemd.py'
Feb 19 19:04:53 compute-0 sudo[70717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:54 compute-0 python3.9[70720]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 19 19:04:54 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Feb 19 19:04:54 compute-0 sshd[1020]: Received SIGHUP; restarting.
Feb 19 19:04:54 compute-0 sshd[1020]: Server listening on 0.0.0.0 port 22.
Feb 19 19:04:54 compute-0 sshd[1020]: Server listening on :: port 22.
Feb 19 19:04:54 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Feb 19 19:04:54 compute-0 sudo[70717]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:54 compute-0 sudo[70874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obydddxpyiqnpoloqtvqhqcvgmhibywa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527894.5497136-509-269096606628692/AnsiballZ_file.py'
Feb 19 19:04:54 compute-0 sudo[70874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:54 compute-0 python3.9[70877]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:04:55 compute-0 sudo[70874]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:55 compute-0 sudo[71027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acggnqgyryymjtwdcxfwawmcsdutvwmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527895.3374856-525-201275582632780/AnsiballZ_stat.py'
Feb 19 19:04:55 compute-0 sudo[71027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:55 compute-0 python3.9[71030]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:04:55 compute-0 sudo[71027]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:56 compute-0 sudo[71151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ureeoobqbaoibnewqyhrcdqsxejthyxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527895.3374856-525-201275582632780/AnsiballZ_copy.py'
Feb 19 19:04:56 compute-0 sudo[71151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:56 compute-0 python3.9[71154]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527895.3374856-525-201275582632780/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:04:56 compute-0 sudo[71151]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:57 compute-0 sudo[71304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hygtmphsmfnducaendadlynxgefnqgcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527896.842197-561-35416873708111/AnsiballZ_timezone.py'
Feb 19 19:04:57 compute-0 sudo[71304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:57 compute-0 python3.9[71307]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 19 19:04:57 compute-0 systemd[1]: Starting Time & Date Service...
Feb 19 19:04:57 compute-0 systemd[1]: Started Time & Date Service.
Feb 19 19:04:57 compute-0 sudo[71304]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:58 compute-0 sudo[71461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sugpftxlxkvjfwbcnsmjeyarhvjzfcsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527897.8827846-579-49921917210248/AnsiballZ_file.py'
Feb 19 19:04:58 compute-0 sudo[71461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:58 compute-0 python3.9[71464]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:04:58 compute-0 sudo[71461]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:58 compute-0 sudo[71614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arfmqbvekzxzagafpriwkxgbauvgqqyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527898.4538329-595-29516467943669/AnsiballZ_stat.py'
Feb 19 19:04:58 compute-0 sudo[71614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:58 compute-0 python3.9[71617]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:04:58 compute-0 sudo[71614]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:59 compute-0 sudo[71738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxedxjxgcycloyukihvdhmcenwqddtjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527898.4538329-595-29516467943669/AnsiballZ_copy.py'
Feb 19 19:04:59 compute-0 sudo[71738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:59 compute-0 python3.9[71741]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771527898.4538329-595-29516467943669/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:04:59 compute-0 sudo[71738]: pam_unix(sudo:session): session closed for user root
Feb 19 19:04:59 compute-0 sudo[71891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpnbcopmjfbdmccwskbwkrpysweeirci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527899.502564-625-278428768499683/AnsiballZ_stat.py'
Feb 19 19:04:59 compute-0 sudo[71891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:04:59 compute-0 python3.9[71894]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:04:59 compute-0 sudo[71891]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:00 compute-0 sudo[72015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckyvuehabdrkhveecptoasjznzvqtbne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527899.502564-625-278428768499683/AnsiballZ_copy.py'
Feb 19 19:05:00 compute-0 sudo[72015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:00 compute-0 python3.9[72018]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771527899.502564-625-278428768499683/.source.yaml _original_basename=.ig36_gjl follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:05:00 compute-0 sudo[72015]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:00 compute-0 sudo[72168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahxxehxmedevvoilknnshobnazxacofh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527900.6772423-655-204976213688761/AnsiballZ_stat.py'
Feb 19 19:05:00 compute-0 sudo[72168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:01 compute-0 python3.9[72171]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:05:01 compute-0 sudo[72168]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:01 compute-0 sudo[72292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcddnpcjjzkrthelcoljgvmpudunqflv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527900.6772423-655-204976213688761/AnsiballZ_copy.py'
Feb 19 19:05:01 compute-0 sudo[72292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:01 compute-0 python3.9[72295]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527900.6772423-655-204976213688761/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:05:01 compute-0 sudo[72292]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:02 compute-0 sudo[72445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfhtmdyqawvvzcossagyqtzflcgvhluc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527901.79389-685-140923812904064/AnsiballZ_command.py'
Feb 19 19:05:02 compute-0 sudo[72445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:02 compute-0 python3.9[72448]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:05:02 compute-0 sudo[72445]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:02 compute-0 sudo[72599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svlyvmfwgbszhnrcgnabcdlyqnhrkkft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527902.4407153-701-82739858953192/AnsiballZ_command.py'
Feb 19 19:05:02 compute-0 sudo[72599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:02 compute-0 python3.9[72602]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:05:02 compute-0 sudo[72599]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:03 compute-0 sudo[72753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfxdsdsizrqcgzbornykqvsbuvgrykao ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771527903.070448-717-275920560731913/AnsiballZ_edpm_nftables_from_files.py'
Feb 19 19:05:03 compute-0 sudo[72753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:03 compute-0 python3[72756]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 19 19:05:03 compute-0 sudo[72753]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:04 compute-0 sudo[72906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyngjxsvzcgnmxhngsffkixyqjvgkksc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527903.873604-733-32546340192559/AnsiballZ_stat.py'
Feb 19 19:05:04 compute-0 sudo[72906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:04 compute-0 python3.9[72909]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:05:04 compute-0 sudo[72906]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:04 compute-0 sudo[73030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxddyfqytzxhpessbgglhfzhpurqljws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527903.873604-733-32546340192559/AnsiballZ_copy.py'
Feb 19 19:05:04 compute-0 sudo[73030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:05 compute-0 python3.9[73033]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527903.873604-733-32546340192559/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:05:05 compute-0 sudo[73030]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:05 compute-0 sudo[73183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzqlvcjkcsihyfxabuderdvlgnjwoyex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527905.2333248-763-122910817925395/AnsiballZ_stat.py'
Feb 19 19:05:05 compute-0 sudo[73183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:05 compute-0 python3.9[73186]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:05:05 compute-0 sudo[73183]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:06 compute-0 sudo[73307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-royellomupxvcthmdifginbdhxuxzidu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527905.2333248-763-122910817925395/AnsiballZ_copy.py'
Feb 19 19:05:06 compute-0 sudo[73307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:06 compute-0 python3.9[73310]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527905.2333248-763-122910817925395/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:05:06 compute-0 sudo[73307]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:06 compute-0 sudo[73460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hovhgihfiqcmjvfwpdikyasxswvkybmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527906.382045-793-245723516288251/AnsiballZ_stat.py'
Feb 19 19:05:06 compute-0 sudo[73460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:06 compute-0 python3.9[73463]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:05:06 compute-0 sudo[73460]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:06 compute-0 sudo[73584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glwyirxkkuxuiuqlomeiteqwgboozicf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527906.382045-793-245723516288251/AnsiballZ_copy.py'
Feb 19 19:05:06 compute-0 sudo[73584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:07 compute-0 python3.9[73587]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527906.382045-793-245723516288251/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:05:07 compute-0 sudo[73584]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:07 compute-0 sudo[73737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voeuanpryiggssehlzcikftmgchfoxyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527907.5016212-823-93305892616704/AnsiballZ_stat.py'
Feb 19 19:05:07 compute-0 sudo[73737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:07 compute-0 python3.9[73740]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:05:07 compute-0 sudo[73737]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:08 compute-0 sudo[73861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkevtjobbzskuoeuavfgbaljinvcbily ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527907.5016212-823-93305892616704/AnsiballZ_copy.py'
Feb 19 19:05:08 compute-0 sudo[73861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:08 compute-0 python3.9[73864]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527907.5016212-823-93305892616704/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:05:08 compute-0 sudo[73861]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:08 compute-0 sudo[74014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvqlyuoxwlxfcxpbglceubbxdjylyhkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527908.6087668-853-205230932767635/AnsiballZ_stat.py'
Feb 19 19:05:08 compute-0 sudo[74014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:09 compute-0 python3.9[74017]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:05:09 compute-0 sudo[74014]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:09 compute-0 sudo[74138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swqbotzmcuktdgsckhosjnaxghgaoawl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527908.6087668-853-205230932767635/AnsiballZ_copy.py'
Feb 19 19:05:09 compute-0 sudo[74138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:09 compute-0 python3.9[74141]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527908.6087668-853-205230932767635/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:05:09 compute-0 sudo[74138]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:09 compute-0 sudo[74291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prgzunmffiifbvoentwktpejueaezvic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527909.788827-883-253130561191835/AnsiballZ_file.py'
Feb 19 19:05:09 compute-0 sudo[74291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:10 compute-0 python3.9[74294]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:05:10 compute-0 sudo[74291]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:10 compute-0 sudo[74444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dautwrmxpgqomdjxzynbwyskjllmksvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527910.359252-899-188047885962579/AnsiballZ_command.py'
Feb 19 19:05:10 compute-0 sudo[74444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:10 compute-0 python3.9[74447]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:05:10 compute-0 sudo[74444]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:11 compute-0 sudo[74604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuqddaoyiwoqitfnorvjqlcugicqajlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527911.0104566-915-111557362751528/AnsiballZ_blockinfile.py'
Feb 19 19:05:11 compute-0 sudo[74604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:11 compute-0 python3.9[74607]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:05:11 compute-0 sudo[74604]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:12 compute-0 sudo[74758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmawzvgcxensinovmebxtsrmbygsjhmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527911.887805-933-96825162145018/AnsiballZ_file.py'
Feb 19 19:05:12 compute-0 sudo[74758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:12 compute-0 python3.9[74761]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:05:12 compute-0 sudo[74758]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:12 compute-0 sudo[74911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-labvqrojshpmpzxoxxldjhggrboewfit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527912.3928246-933-239570379165134/AnsiballZ_file.py'
Feb 19 19:05:12 compute-0 sudo[74911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:12 compute-0 python3.9[74914]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:05:12 compute-0 sudo[74911]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:13 compute-0 sudo[75064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkjsgzpqivuavlhoqfekonlmttlipjsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527913.1009035-963-91084228505276/AnsiballZ_mount.py'
Feb 19 19:05:13 compute-0 sudo[75064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:13 compute-0 python3.9[75067]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 19 19:05:13 compute-0 sudo[75064]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:13 compute-0 rsyslogd[1019]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 19 19:05:13 compute-0 rsyslogd[1019]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 19 19:05:13 compute-0 sudo[75219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akiffjlxghmmksnezoeoerkmsmkiotjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527913.7481472-963-226219419357027/AnsiballZ_mount.py'
Feb 19 19:05:13 compute-0 sudo[75219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:14 compute-0 python3.9[75222]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 19 19:05:14 compute-0 sudo[75219]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:14 compute-0 sshd-session[65957]: Connection closed by 192.168.122.30 port 59654
Feb 19 19:05:14 compute-0 sshd-session[65954]: pam_unix(sshd:session): session closed for user zuul
Feb 19 19:05:14 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Feb 19 19:05:14 compute-0 systemd[1]: session-15.scope: Consumed 29.515s CPU time.
Feb 19 19:05:14 compute-0 systemd-logind[822]: Session 15 logged out. Waiting for processes to exit.
Feb 19 19:05:14 compute-0 systemd-logind[822]: Removed session 15.
Feb 19 19:05:20 compute-0 sshd-session[75248]: Accepted publickey for zuul from 192.168.122.30 port 58422 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 19:05:20 compute-0 systemd-logind[822]: New session 16 of user zuul.
Feb 19 19:05:20 compute-0 systemd[1]: Started Session 16 of User zuul.
Feb 19 19:05:20 compute-0 sshd-session[75248]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 19:05:20 compute-0 sudo[75401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twtujrwpdvtpqamuzyrjzqvjrekxlweh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527920.4142377-17-1970965109635/AnsiballZ_tempfile.py'
Feb 19 19:05:20 compute-0 sudo[75401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:20 compute-0 python3.9[75404]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Feb 19 19:05:20 compute-0 sudo[75401]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:21 compute-0 sudo[75554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyohlsuocwmkmsbtcvrlyigchdnrrrvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527921.16453-41-139681820062170/AnsiballZ_stat.py'
Feb 19 19:05:21 compute-0 sudo[75554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:21 compute-0 python3.9[75557]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:05:21 compute-0 sudo[75554]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:22 compute-0 sudo[75707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnifcyfzcehsgxhydnbvlkycxmbcjhxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527921.9538007-61-23312492877171/AnsiballZ_setup.py'
Feb 19 19:05:22 compute-0 sudo[75707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:22 compute-0 python3.9[75710]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:05:22 compute-0 sudo[75707]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:23 compute-0 sudo[75860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvjdmdpbjwexibkmvlkrqcytzetwibte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527922.9369478-78-206765985046508/AnsiballZ_blockinfile.py'
Feb 19 19:05:23 compute-0 sudo[75860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:23 compute-0 python3.9[75863]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCyFaihmVL/MaZICMlw4g14Nps4ZNlfGlpgWQ8uOSlLeuoOOyL3SQm/Bd2ei+d9q1rVwReG/DJaGBrXbQ3jhvWSCv1gZt6zRU1Vc8V7a6vFlhqkturmA+M/1XGPKY5AJgPlpsuhB7DjNuEQrWs2ZkxhoSH5sEPH+G7KZ3o7C7Fcpno/a8Ordg/5WgPmOvM0ZFiqkBRTFADIuknEi/FalAAqnp6bhl4fP78VY6A61rIwTLKdsznlUIMaxNh5uHT5SRwNeLQIAu8PAIcgyeiY5GiWDKJfAH8iXGYc6FNbJZjCiiAkXmrWyY7433prqnw++87hNOnkRvq/mA+OBQre8og8cgU64Vn794LbiF6A0pDKEdHhHWMcDmtowlnOFXUvYLGN/HuLgQjwHA45MY1UZvqRRtdBpR+aKo3ziXlAjCK16Zd6ZcDylxGOHidjoXcjsTZfYtFMhKX2Zol0CKO38ZKJOS5MT1VQJPgtPUnu9KScGbA6JpWdkaafOgzpPgD11TU=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGjxVu7fsbfEg7jgPiYdeSQKhE57RykKaKfA1U2cHZfh
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBB/F9TU/3W1xNUDH8Gv2RzYIWdjQuxbL4n9CY8n3awRr7rK+EhPU31bszhaPmm2F4eDx62iqbHhoUxE1gBWGKsg=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDGRPPsOETEzUBy2wqzE3RUhzgSAItsZSdgwmi04e8bBYAUmhNQIxRegTFkkAsxh+jtJ/bIQQM43iizE6RiAScye5N8l9BgaHLfR2D0BfHxxA4fycxCK9Mwvw1wz01OelIU7+/CAKNNf3RJlS5sqTXhrH7y8brCRQOscdSG6y8PUamCJrv7kANwSN7axTAHCe9/YgSj4xZ7J076b0RvRXuRAmvy59Ap6WHihHUcBxmymHBw6QI5Hulc8VxqXudrgR+NHmWdMWctNl7YrkcMmV6iebsv7p1r9J0PvMVH4c00RH2KyD5thLD2VsKL36OifKTQy3rUrRmOU7kh9YOIdYiDRunWzDDVqm1OPY8hSGkjQ357S7NzZmYrfm5GsuaAcp3/ENKEHu7gk8MIsuG9YytzwqaBYpqDdxv2AU+yjDA5Ef9TfikqRqxdboIpQlJBmBygAO3nrG0cdCSzBubDFD9koSaIoZVCWDoraSzgh27jfNvoZSa/efHQkFawR/JhNkU=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDrgENvK8j7bca8u1UHaM64/IxEaIDSD5KU85ZwnsHQj
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBC0t0KwRnzfFtjsf+8aFBlczUETCuAe0bMDiKYmyk+t4DjL/QKF6AHudh+G9WBZZ0JA9E83hxv4voCb/3u7h5do=
                                             create=True mode=0644 path=/tmp/ansible.nwdvbcb4 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:05:23 compute-0 sudo[75860]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:24 compute-0 sudo[76013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glslqrxueytngngnubjaajteivwfagju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527923.6767263-94-12235490180263/AnsiballZ_command.py'
Feb 19 19:05:24 compute-0 sudo[76013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:24 compute-0 python3.9[76016]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.nwdvbcb4' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:05:24 compute-0 sudo[76013]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:24 compute-0 sudo[76168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmymkfqslqrvxuetwazizdtckyjtvlza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527924.5235207-110-250495055675298/AnsiballZ_file.py'
Feb 19 19:05:24 compute-0 sudo[76168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:25 compute-0 python3.9[76171]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.nwdvbcb4 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:05:25 compute-0 sudo[76168]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:25 compute-0 sshd-session[75251]: Connection closed by 192.168.122.30 port 58422
Feb 19 19:05:25 compute-0 sshd-session[75248]: pam_unix(sshd:session): session closed for user zuul
Feb 19 19:05:25 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Feb 19 19:05:25 compute-0 systemd[1]: session-16.scope: Consumed 2.546s CPU time.
Feb 19 19:05:25 compute-0 systemd-logind[822]: Session 16 logged out. Waiting for processes to exit.
Feb 19 19:05:25 compute-0 systemd-logind[822]: Removed session 16.
Feb 19 19:05:27 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 19 19:05:30 compute-0 sshd-session[76199]: Accepted publickey for zuul from 192.168.122.30 port 49672 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 19:05:30 compute-0 systemd-logind[822]: New session 17 of user zuul.
Feb 19 19:05:30 compute-0 systemd[1]: Started Session 17 of User zuul.
Feb 19 19:05:30 compute-0 sshd-session[76199]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 19:05:31 compute-0 python3.9[76352]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:05:32 compute-0 sudo[76506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlxgfzaggkmsltxwfaaevkxlwiaijjgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527931.5197124-39-125484816219321/AnsiballZ_systemd.py'
Feb 19 19:05:32 compute-0 sudo[76506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:32 compute-0 python3.9[76509]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 19 19:05:32 compute-0 sudo[76506]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:33 compute-0 sudo[76661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihedavysktzjfsdumvmqxukdtxkmearn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527933.6647081-55-195713472904948/AnsiballZ_systemd.py'
Feb 19 19:05:33 compute-0 sudo[76661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:34 compute-0 python3.9[76664]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 19 19:05:34 compute-0 sudo[76661]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:34 compute-0 sudo[76815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmxnnwyhasspaujxwsmgymehljelgegq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527934.584964-73-167718991014567/AnsiballZ_command.py'
Feb 19 19:05:34 compute-0 sudo[76815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:35 compute-0 python3.9[76818]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:05:35 compute-0 sudo[76815]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:35 compute-0 sudo[76969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqiatsyijmahgcyllyhnrornsaudcidm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527935.3491066-89-128758696760833/AnsiballZ_stat.py'
Feb 19 19:05:35 compute-0 sudo[76969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:35 compute-0 python3.9[76972]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:05:35 compute-0 sudo[76969]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:36 compute-0 sudo[77124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvkngnifbjijadxxdiljkaahfdvlueud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527936.1106822-105-141209945040545/AnsiballZ_command.py'
Feb 19 19:05:36 compute-0 sudo[77124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:36 compute-0 python3.9[77127]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:05:36 compute-0 sudo[77124]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:37 compute-0 sudo[77280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiqdhgaqnrihoankjgosndyojbadlpgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527936.8287103-121-179642490922410/AnsiballZ_file.py'
Feb 19 19:05:37 compute-0 sudo[77280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:37 compute-0 python3.9[77283]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:05:37 compute-0 sudo[77280]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:37 compute-0 sshd-session[76202]: Connection closed by 192.168.122.30 port 49672
Feb 19 19:05:37 compute-0 sshd-session[76199]: pam_unix(sshd:session): session closed for user zuul
Feb 19 19:05:37 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Feb 19 19:05:37 compute-0 systemd[1]: session-17.scope: Consumed 3.485s CPU time.
Feb 19 19:05:37 compute-0 systemd-logind[822]: Session 17 logged out. Waiting for processes to exit.
Feb 19 19:05:37 compute-0 systemd-logind[822]: Removed session 17.
Feb 19 19:05:42 compute-0 sshd-session[77309]: Accepted publickey for zuul from 192.168.122.30 port 35890 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 19:05:42 compute-0 systemd-logind[822]: New session 18 of user zuul.
Feb 19 19:05:42 compute-0 systemd[1]: Started Session 18 of User zuul.
Feb 19 19:05:42 compute-0 sshd-session[77309]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 19:05:44 compute-0 python3.9[77462]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:05:44 compute-0 sudo[77616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iaikgmcxoqvqjhnduehortmhupgujdml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527944.493488-43-40502194434615/AnsiballZ_setup.py'
Feb 19 19:05:44 compute-0 sudo[77616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:44 compute-0 python3.9[77619]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 19 19:05:45 compute-0 sudo[77616]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:45 compute-0 sudo[77701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qawtdeywdmepvpzjeopesukuzjnkcbvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527944.493488-43-40502194434615/AnsiballZ_dnf.py'
Feb 19 19:05:45 compute-0 sudo[77701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:05:45 compute-0 python3.9[77704]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 19 19:05:47 compute-0 sudo[77701]: pam_unix(sudo:session): session closed for user root
Feb 19 19:05:47 compute-0 python3.9[77855]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:05:48 compute-0 python3.9[78006]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 19 19:05:49 compute-0 python3.9[78156]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:05:50 compute-0 python3.9[78306]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:05:50 compute-0 sshd-session[77312]: Connection closed by 192.168.122.30 port 35890
Feb 19 19:05:50 compute-0 sshd-session[77309]: pam_unix(sshd:session): session closed for user zuul
Feb 19 19:05:50 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Feb 19 19:05:50 compute-0 systemd[1]: session-18.scope: Consumed 5.146s CPU time.
Feb 19 19:05:50 compute-0 systemd-logind[822]: Session 18 logged out. Waiting for processes to exit.
Feb 19 19:05:50 compute-0 systemd-logind[822]: Removed session 18.
Feb 19 19:05:57 compute-0 sshd-session[78331]: Accepted publickey for zuul from 192.168.122.30 port 52636 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 19:05:57 compute-0 systemd-logind[822]: New session 19 of user zuul.
Feb 19 19:05:57 compute-0 systemd[1]: Started Session 19 of User zuul.
Feb 19 19:05:57 compute-0 sshd-session[78331]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 19:05:58 compute-0 python3.9[78484]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:06:00 compute-0 sudo[78638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xybhtmafwrtsyjbbgsobrjvmfardildv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527959.7804718-77-111121702977141/AnsiballZ_file.py'
Feb 19 19:06:00 compute-0 sudo[78638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:00 compute-0 python3.9[78641]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:06:00 compute-0 sudo[78638]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:00 compute-0 sudo[78791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptwdppmnvtklwkecrhgxicaucyrpatcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527960.452615-77-91515974571901/AnsiballZ_file.py'
Feb 19 19:06:00 compute-0 sudo[78791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:00 compute-0 python3.9[78794]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:06:00 compute-0 sudo[78791]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:01 compute-0 sudo[78944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvcqgniqbfkrsqrihcgasgpakiscyoba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527960.952939-104-31809174486127/AnsiballZ_stat.py'
Feb 19 19:06:01 compute-0 sudo[78944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:01 compute-0 python3.9[78947]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:01 compute-0 sudo[78944]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:01 compute-0 sudo[79068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwomsomsfcsuihzhwtijdezftnymterj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527960.952939-104-31809174486127/AnsiballZ_copy.py'
Feb 19 19:06:01 compute-0 sudo[79068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:02 compute-0 python3.9[79071]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527960.952939-104-31809174486127/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=e4b8c9fe29b753ba85a6aa4a4589382416a506d4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:02 compute-0 sudo[79068]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:02 compute-0 sudo[79221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwuwzzyqtzrpxnlpklgnmxkkgfacqbjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527962.2514143-104-149736780606637/AnsiballZ_stat.py'
Feb 19 19:06:02 compute-0 sudo[79221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:02 compute-0 python3.9[79224]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:02 compute-0 sudo[79221]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:02 compute-0 sudo[79345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thympcjficjaxprqwbapfwziioncnfwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527962.2514143-104-149736780606637/AnsiballZ_copy.py'
Feb 19 19:06:02 compute-0 sudo[79345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:03 compute-0 python3.9[79348]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527962.2514143-104-149736780606637/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=55a8055996d049d79cf4d753f57cc5f0c6eed2d2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:03 compute-0 sudo[79345]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:03 compute-0 sudo[79498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iciwgpsqjodudatfyszfcsncblvxguen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527963.3077507-104-138091501730244/AnsiballZ_stat.py'
Feb 19 19:06:03 compute-0 sudo[79498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:03 compute-0 python3.9[79501]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:03 compute-0 sudo[79498]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:04 compute-0 sudo[79622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzdkqnlyzmcyqqvdcmdqmvmjslujdzwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527963.3077507-104-138091501730244/AnsiballZ_copy.py'
Feb 19 19:06:04 compute-0 sudo[79622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:04 compute-0 python3.9[79625]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527963.3077507-104-138091501730244/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=2e735fb65e427c5e93d05caa7f111148da38b30c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:04 compute-0 sudo[79622]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:04 compute-0 sudo[79775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deljlnbymlgocygkiobyqrzhkuppspdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527964.4959116-192-270840029307009/AnsiballZ_file.py'
Feb 19 19:06:04 compute-0 sudo[79775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:04 compute-0 python3.9[79778]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:06:04 compute-0 sudo[79775]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:05 compute-0 sudo[79928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niemjdtgllggwjbzcwzzzraywoazynni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527965.0836465-192-260587659415087/AnsiballZ_file.py'
Feb 19 19:06:05 compute-0 sudo[79928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:05 compute-0 python3.9[79931]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:06:05 compute-0 sudo[79928]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:05 compute-0 sudo[80081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqdaxbnkzbkbjrbyhoxtrcnoxxoiitmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527965.6967509-223-111618760610554/AnsiballZ_stat.py'
Feb 19 19:06:05 compute-0 sudo[80081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:06 compute-0 python3.9[80084]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:06 compute-0 sudo[80081]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:06 compute-0 sudo[80205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dflxxlrldknbaosowaxofzxyyzfgtqvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527965.6967509-223-111618760610554/AnsiballZ_copy.py'
Feb 19 19:06:06 compute-0 sudo[80205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:06 compute-0 python3.9[80208]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527965.6967509-223-111618760610554/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=64f53c6ba02d637de674d571dc11f40b0a42b7bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:06 compute-0 sudo[80205]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:06 compute-0 sudo[80358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofmjxjczvjwwblzluizbnrvdhggraoqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527966.6271632-223-164928744031865/AnsiballZ_stat.py'
Feb 19 19:06:06 compute-0 sudo[80358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:07 compute-0 python3.9[80361]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:07 compute-0 sudo[80358]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:07 compute-0 sudo[80482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpjlnnexhuwinpqqshonxklwjpgsnvzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527966.6271632-223-164928744031865/AnsiballZ_copy.py'
Feb 19 19:06:07 compute-0 sudo[80482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:07 compute-0 python3.9[80485]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527966.6271632-223-164928744031865/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=c062c9cef5ac35af801bf2fa5b5bf454dba7d02a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:07 compute-0 sudo[80482]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:07 compute-0 sudo[80635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrzjhyvcfgkbuivokatkvfpelbmqqxli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527967.6146412-223-96290918506642/AnsiballZ_stat.py'
Feb 19 19:06:07 compute-0 sudo[80635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:08 compute-0 python3.9[80638]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:08 compute-0 sudo[80635]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:08 compute-0 sudo[80759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyvnurrceimviqypnoupvjhnzpltstpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527967.6146412-223-96290918506642/AnsiballZ_copy.py'
Feb 19 19:06:08 compute-0 sudo[80759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:08 compute-0 python3.9[80762]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527967.6146412-223-96290918506642/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=2fb19a9697d7ae484b05a6406a519ac5e641273c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:08 compute-0 sudo[80759]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:08 compute-0 sudo[80912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hustemjdruojuypoqzwoqybrefpbfslo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527968.761002-307-92640906725037/AnsiballZ_file.py'
Feb 19 19:06:08 compute-0 sudo[80912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:09 compute-0 python3.9[80915]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:06:09 compute-0 sudo[80912]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:09 compute-0 sudo[81065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfijqonaztmvmembntnbdgqlyychtzeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527969.335872-307-19711351518817/AnsiballZ_file.py'
Feb 19 19:06:09 compute-0 sudo[81065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:09 compute-0 python3.9[81068]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:06:09 compute-0 sudo[81065]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:10 compute-0 sudo[81218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqcyhayvnfkbxnewzibswhuqqqppccfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527969.9740996-336-220599834242715/AnsiballZ_stat.py'
Feb 19 19:06:10 compute-0 sudo[81218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:10 compute-0 python3.9[81221]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:10 compute-0 sudo[81218]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:10 compute-0 sudo[81342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwhyzmjrpiucyfvroypqvgslhtbfhjrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527969.9740996-336-220599834242715/AnsiballZ_copy.py'
Feb 19 19:06:10 compute-0 sudo[81342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:10 compute-0 python3.9[81345]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527969.9740996-336-220599834242715/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=51c9099c9be8787ad8feeb5d6eb4f6ce58ece01d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:10 compute-0 sudo[81342]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:11 compute-0 sudo[81495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adfgjmifchafiqyzjdkxervdjqyjfnra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527971.099876-336-24698253801014/AnsiballZ_stat.py'
Feb 19 19:06:11 compute-0 sudo[81495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:11 compute-0 python3.9[81498]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:11 compute-0 sudo[81495]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:11 compute-0 sudo[81619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kobvxrmujrlcvmqlejwxwykudagqrbgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527971.099876-336-24698253801014/AnsiballZ_copy.py'
Feb 19 19:06:11 compute-0 sudo[81619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:11 compute-0 python3.9[81622]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527971.099876-336-24698253801014/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=a038ce94f26fbf98c356fedaf9160229d7fc152c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:11 compute-0 sudo[81619]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:12 compute-0 sudo[81772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xohiatkpfgrqtyeelnudgilfidgndbdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527972.0543451-336-85193430293153/AnsiballZ_stat.py'
Feb 19 19:06:12 compute-0 sudo[81772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:12 compute-0 python3.9[81775]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:12 compute-0 sudo[81772]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:12 compute-0 sudo[81896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtjsqygbuttjfpuflscmqrddhqjeaxva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527972.0543451-336-85193430293153/AnsiballZ_copy.py'
Feb 19 19:06:12 compute-0 sudo[81896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:12 compute-0 python3.9[81899]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527972.0543451-336-85193430293153/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=36e992e0d24570337bac86983b3d85c043fb73a0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:13 compute-0 sudo[81896]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:13 compute-0 sudo[82049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afrdoqidfeyivdaworfhxqgkijynafhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527973.2019866-420-79664804952494/AnsiballZ_file.py'
Feb 19 19:06:13 compute-0 sudo[82049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:13 compute-0 python3.9[82052]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:06:13 compute-0 sudo[82049]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:14 compute-0 sudo[82202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lygarpueifrwxdqlgimnirzcczwtilaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527973.7532027-420-180724932639345/AnsiballZ_file.py'
Feb 19 19:06:14 compute-0 sudo[82202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:14 compute-0 python3.9[82205]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:06:14 compute-0 sudo[82202]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:14 compute-0 sudo[82355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piuhsswpyxqsoyonstrfjmlfbjyjvfgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527974.346881-451-85210356899117/AnsiballZ_stat.py'
Feb 19 19:06:14 compute-0 sudo[82355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:14 compute-0 python3.9[82358]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:14 compute-0 sudo[82355]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:15 compute-0 sudo[82479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcgisfxlssysseglpadsleojqbetmpal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527974.346881-451-85210356899117/AnsiballZ_copy.py'
Feb 19 19:06:15 compute-0 sudo[82479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:15 compute-0 python3.9[82482]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527974.346881-451-85210356899117/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=bfdab316b82c2114348fc3e88f37bc72be6370ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:15 compute-0 sudo[82479]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:15 compute-0 sudo[82632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snjudgmasbwnvyagrejngwzjvjpizshc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527975.3175228-451-195435142601485/AnsiballZ_stat.py'
Feb 19 19:06:15 compute-0 sudo[82632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:15 compute-0 python3.9[82635]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:15 compute-0 sudo[82632]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:16 compute-0 sudo[82756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqwlxxlgazhwvzgpqrcyuecmkifdjjbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527975.3175228-451-195435142601485/AnsiballZ_copy.py'
Feb 19 19:06:16 compute-0 sudo[82756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:16 compute-0 python3.9[82759]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527975.3175228-451-195435142601485/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=a038ce94f26fbf98c356fedaf9160229d7fc152c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:16 compute-0 sudo[82756]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:16 compute-0 sudo[82909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdcqvskedetgmvueqsfyuhkhlljdjyna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527976.3564515-451-215085217959337/AnsiballZ_stat.py'
Feb 19 19:06:16 compute-0 sudo[82909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:16 compute-0 python3.9[82912]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:16 compute-0 sudo[82909]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:16 compute-0 sudo[83033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejkhqxcswewrdoldbrujxpcqtlgovvwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527976.3564515-451-215085217959337/AnsiballZ_copy.py'
Feb 19 19:06:16 compute-0 sudo[83033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:17 compute-0 python3.9[83036]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527976.3564515-451-215085217959337/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=c81695c2be971f9f060caf7592c09859e34d802c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:17 compute-0 sudo[83033]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:17 compute-0 sudo[83186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyovqrkhnkruwnelcsfljblhjgdjthlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527977.754356-554-194485702664846/AnsiballZ_file.py'
Feb 19 19:06:17 compute-0 sudo[83186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:18 compute-0 python3.9[83189]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:06:18 compute-0 sudo[83186]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:18 compute-0 sudo[83339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-diurmupjmwnoxasdnpjaceddyobwhrzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527978.3156257-569-181956176862644/AnsiballZ_stat.py'
Feb 19 19:06:18 compute-0 sudo[83339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:18 compute-0 python3.9[83342]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:18 compute-0 sudo[83339]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:18 compute-0 sudo[83463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myieijizqqwknybuzzbhehczemtabqkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527978.3156257-569-181956176862644/AnsiballZ_copy.py'
Feb 19 19:06:18 compute-0 sudo[83463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:19 compute-0 python3.9[83466]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527978.3156257-569-181956176862644/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d5e185114d46c25488692b53a20c24df98df9ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:19 compute-0 sudo[83463]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:19 compute-0 sudo[83616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjnpfjuezjsnijmnpovlooizoletnami ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527979.25589-609-30511246402281/AnsiballZ_file.py'
Feb 19 19:06:19 compute-0 sudo[83616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:19 compute-0 python3.9[83619]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:06:19 compute-0 sudo[83616]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:19 compute-0 sudo[83769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siieovqmodvpywihsfvlsrfsozxoiytk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527979.7625136-624-92361265874255/AnsiballZ_stat.py'
Feb 19 19:06:19 compute-0 sudo[83769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:20 compute-0 python3.9[83772]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:20 compute-0 sudo[83769]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:20 compute-0 sudo[83893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrxobyswxstmxpxzvtglixfsixhdsrir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527979.7625136-624-92361265874255/AnsiballZ_copy.py'
Feb 19 19:06:20 compute-0 sudo[83893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:20 compute-0 python3.9[83896]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527979.7625136-624-92361265874255/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d5e185114d46c25488692b53a20c24df98df9ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:20 compute-0 sudo[83893]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:21 compute-0 sudo[84046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfbotejuzbvdzwdmrkmcdfzlgeyivbas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527980.8545587-653-195676260134975/AnsiballZ_file.py'
Feb 19 19:06:21 compute-0 sudo[84046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:21 compute-0 python3.9[84049]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:06:21 compute-0 sudo[84046]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:21 compute-0 sudo[84199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isdaalxvgzjimqtvlgohmgmytlrsdjyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527981.3875213-666-32939691831472/AnsiballZ_stat.py'
Feb 19 19:06:21 compute-0 sudo[84199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:21 compute-0 python3.9[84202]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:21 compute-0 sudo[84199]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:22 compute-0 sudo[84323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqdwkurkekbzxyeaxkwmmfdeqzsaaijn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527981.3875213-666-32939691831472/AnsiballZ_copy.py'
Feb 19 19:06:22 compute-0 sudo[84323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:22 compute-0 python3.9[84326]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527981.3875213-666-32939691831472/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d5e185114d46c25488692b53a20c24df98df9ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:22 compute-0 sudo[84323]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:22 compute-0 sudo[84476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egczzzeaihuxjhfxpkkuqqtmdzulwmhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527982.4655209-697-152096260386330/AnsiballZ_file.py'
Feb 19 19:06:22 compute-0 sudo[84476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:22 compute-0 python3.9[84479]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:06:22 compute-0 sudo[84476]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:23 compute-0 sudo[84629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcbwaxverwbolfmqrzineqfyjlwnsjfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527982.952257-714-215531932698185/AnsiballZ_stat.py'
Feb 19 19:06:23 compute-0 sudo[84629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:23 compute-0 python3.9[84632]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:23 compute-0 sudo[84629]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:23 compute-0 sudo[84753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdfnfrpvgjvaqgapupreuhhjxzbfhusb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527982.952257-714-215531932698185/AnsiballZ_copy.py'
Feb 19 19:06:23 compute-0 sudo[84753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:23 compute-0 python3.9[84756]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527982.952257-714-215531932698185/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d5e185114d46c25488692b53a20c24df98df9ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:23 compute-0 sudo[84753]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:24 compute-0 sudo[84906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtxfvjfjgptajbuxaltccdwbymkuxhgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527983.973891-741-49463244507162/AnsiballZ_file.py'
Feb 19 19:06:24 compute-0 sudo[84906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:24 compute-0 python3.9[84909]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:06:24 compute-0 sudo[84906]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:24 compute-0 sudo[85059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofoxnpmaqnzzfitqwmyqvgwzovsksffx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527984.5244515-755-23474126752687/AnsiballZ_stat.py'
Feb 19 19:06:24 compute-0 sudo[85059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:24 compute-0 python3.9[85062]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:24 compute-0 sudo[85059]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:25 compute-0 sudo[85183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kllkekirjfodxrmzzkfjfylrvjherfyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527984.5244515-755-23474126752687/AnsiballZ_copy.py'
Feb 19 19:06:25 compute-0 sudo[85183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:25 compute-0 python3.9[85186]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527984.5244515-755-23474126752687/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d5e185114d46c25488692b53a20c24df98df9ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:25 compute-0 sudo[85183]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:25 compute-0 sudo[85336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bykctnkjkrivyhrdajauuivlhwvjdtxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527985.658973-786-161784919857337/AnsiballZ_file.py'
Feb 19 19:06:25 compute-0 sudo[85336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:26 compute-0 python3.9[85339]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:06:26 compute-0 sudo[85336]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:26 compute-0 sudo[85489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtnihqkndrzyxlmrsxlkrdpbybwmmein ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527986.2401502-801-149045891651621/AnsiballZ_stat.py'
Feb 19 19:06:26 compute-0 sudo[85489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:26 compute-0 python3.9[85492]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:26 compute-0 sudo[85489]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:26 compute-0 sudo[85613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcnfiuvedjlowgpzwbmynosnwbosuifz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527986.2401502-801-149045891651621/AnsiballZ_copy.py'
Feb 19 19:06:26 compute-0 sudo[85613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:27 compute-0 python3.9[85616]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527986.2401502-801-149045891651621/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d5e185114d46c25488692b53a20c24df98df9ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:27 compute-0 sudo[85613]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:27 compute-0 sudo[85766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcgulonrhqhtvtsjraczbwdfpazgxbak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527987.2138948-826-200995559563258/AnsiballZ_file.py'
Feb 19 19:06:27 compute-0 sudo[85766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:27 compute-0 python3.9[85769]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:06:27 compute-0 sudo[85766]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:27 compute-0 sudo[85919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chonxisnaawlljmhfbbvpbptycdgfmmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527987.7596428-842-251121171365822/AnsiballZ_stat.py'
Feb 19 19:06:27 compute-0 sudo[85919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:28 compute-0 python3.9[85922]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:28 compute-0 sudo[85919]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:28 compute-0 sudo[86043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvawzmuhbtbixrkkgqczmnzbmqhmyqps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527987.7596428-842-251121171365822/AnsiballZ_copy.py'
Feb 19 19:06:28 compute-0 sudo[86043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:28 compute-0 python3.9[86046]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771527987.7596428-842-251121171365822/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d5e185114d46c25488692b53a20c24df98df9ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:28 compute-0 sudo[86043]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:29 compute-0 chronyd[65928]: Selected source 167.160.187.179 (pool.ntp.org)
Feb 19 19:06:32 compute-0 sshd-session[78334]: Connection closed by 192.168.122.30 port 52636
Feb 19 19:06:32 compute-0 sshd-session[78331]: pam_unix(sshd:session): session closed for user zuul
Feb 19 19:06:32 compute-0 systemd-logind[822]: Session 19 logged out. Waiting for processes to exit.
Feb 19 19:06:32 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Feb 19 19:06:32 compute-0 systemd[1]: session-19.scope: Consumed 22.921s CPU time.
Feb 19 19:06:32 compute-0 systemd-logind[822]: Removed session 19.
Feb 19 19:06:38 compute-0 sshd-session[86073]: Accepted publickey for zuul from 192.168.122.30 port 37324 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 19:06:38 compute-0 systemd-logind[822]: New session 20 of user zuul.
Feb 19 19:06:38 compute-0 systemd[1]: Started Session 20 of User zuul.
Feb 19 19:06:38 compute-0 sshd-session[86073]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 19:06:38 compute-0 sshd-session[86071]: Invalid user claude from 138.255.157.62 port 11576
Feb 19 19:06:38 compute-0 sshd-session[86071]: Received disconnect from 138.255.157.62 port 11576:11: Bye Bye [preauth]
Feb 19 19:06:38 compute-0 sshd-session[86071]: Disconnected from invalid user claude 138.255.157.62 port 11576 [preauth]
Feb 19 19:06:39 compute-0 python3.9[86226]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:06:39 compute-0 sudo[86380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eprkejvbztyokmqsdwrehoeodedbnucc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771527999.5899587-43-242011982093494/AnsiballZ_file.py'
Feb 19 19:06:39 compute-0 sudo[86380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:40 compute-0 python3.9[86383]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:06:40 compute-0 sudo[86380]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:40 compute-0 sudo[86533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvldhkjwgckoiymwtfrcliehkxeqoerd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528000.22027-43-236157717389526/AnsiballZ_file.py'
Feb 19 19:06:40 compute-0 sudo[86533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:40 compute-0 python3.9[86536]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:06:40 compute-0 sudo[86533]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:41 compute-0 python3.9[86686]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:06:41 compute-0 sudo[86836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jilofizukrioxvrupslnphrbskqnftdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528001.6073327-89-209777256745130/AnsiballZ_seboolean.py'
Feb 19 19:06:41 compute-0 sudo[86836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:42 compute-0 python3.9[86839]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 19 19:06:43 compute-0 sudo[86836]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:43 compute-0 sudo[86993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrmrwxylrqbbfrmxujnwmokfdscurlig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528003.471685-109-240802061397339/AnsiballZ_setup.py'
Feb 19 19:06:43 compute-0 dbus-broker-launch[804]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Feb 19 19:06:43 compute-0 sudo[86993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:43 compute-0 python3.9[86996]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 19 19:06:44 compute-0 sudo[86993]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:44 compute-0 sudo[87078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qefdpjwfywltaxjgxlyiitiiwsbrjzzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528003.471685-109-240802061397339/AnsiballZ_dnf.py'
Feb 19 19:06:44 compute-0 sudo[87078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:44 compute-0 python3.9[87081]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 19 19:06:46 compute-0 sudo[87078]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:46 compute-0 sudo[87232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wachjlakeisdeiaasxnybgyvodydxyvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528006.2903376-133-89058538886783/AnsiballZ_systemd.py'
Feb 19 19:06:46 compute-0 sudo[87232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:47 compute-0 python3.9[87235]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 19 19:06:47 compute-0 sudo[87232]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:47 compute-0 sudo[87388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnmizfhxojyxqvskepchrndhcqwnfzjp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771528007.3651085-149-25449629873829/AnsiballZ_edpm_nftables_snippet.py'
Feb 19 19:06:47 compute-0 sudo[87388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:47 compute-0 python3[87391]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Feb 19 19:06:47 compute-0 sudo[87388]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:48 compute-0 sudo[87543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vararbjkzdlqgeifxgympvfbzmltojqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528008.2296937-167-208787246656160/AnsiballZ_file.py'
Feb 19 19:06:48 compute-0 sudo[87543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:48 compute-0 python3.9[87546]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:48 compute-0 sudo[87543]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:49 compute-0 sudo[87696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eecovsxkfgyjicvedjrvrafapzbponwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528008.897739-183-46137299148325/AnsiballZ_stat.py'
Feb 19 19:06:49 compute-0 sudo[87696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:49 compute-0 sshd-session[87392]: Received disconnect from 27.50.25.190 port 53078:11: Bye Bye [preauth]
Feb 19 19:06:49 compute-0 sshd-session[87392]: Disconnected from authenticating user root 27.50.25.190 port 53078 [preauth]
Feb 19 19:06:49 compute-0 python3.9[87699]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:49 compute-0 sudo[87696]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:49 compute-0 sudo[87775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hohtgtrvdvjnrrhyretrtmciyenoxugz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528008.897739-183-46137299148325/AnsiballZ_file.py'
Feb 19 19:06:49 compute-0 sudo[87775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:49 compute-0 python3.9[87778]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:49 compute-0 sudo[87775]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:50 compute-0 sudo[87928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kchdmflwhsyqmlmhxhhnhplexkwnithm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528010.2155013-207-138411084900571/AnsiballZ_stat.py'
Feb 19 19:06:50 compute-0 sudo[87928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:50 compute-0 python3.9[87931]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:50 compute-0 sudo[87928]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:50 compute-0 sudo[88007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agsfenlsudboaodbakibrkloltrphlgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528010.2155013-207-138411084900571/AnsiballZ_file.py'
Feb 19 19:06:50 compute-0 sudo[88007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:51 compute-0 python3.9[88010]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.8a1s87hj recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:51 compute-0 sudo[88007]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:51 compute-0 sudo[88160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bofqnykyzagjazgxcxjygpgjbljgzfzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528011.4533942-231-27484638815020/AnsiballZ_stat.py'
Feb 19 19:06:51 compute-0 sudo[88160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:51 compute-0 python3.9[88163]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:51 compute-0 sudo[88160]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:52 compute-0 sudo[88239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpsvaunmqpboqawzwbunotzdeoupooqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528011.4533942-231-27484638815020/AnsiballZ_file.py'
Feb 19 19:06:52 compute-0 sudo[88239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:52 compute-0 python3.9[88242]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:52 compute-0 sudo[88239]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:53 compute-0 sudo[88392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhlqfeejhetxxyflvjwuzoqthlgijndz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528012.6087112-257-230786063727213/AnsiballZ_command.py'
Feb 19 19:06:53 compute-0 sudo[88392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:53 compute-0 python3.9[88395]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:06:53 compute-0 sudo[88392]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:53 compute-0 sudo[88546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbcqbmhertboqvgwcfpopxlofhjhupxj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771528013.4467967-273-61646323600264/AnsiballZ_edpm_nftables_from_files.py'
Feb 19 19:06:53 compute-0 sudo[88546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:54 compute-0 python3[88549]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 19 19:06:54 compute-0 sudo[88546]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:54 compute-0 sudo[88699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dadmzkgpbdxfbjzbieipntjlqnenlqas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528014.669002-289-139895174521149/AnsiballZ_stat.py'
Feb 19 19:06:54 compute-0 sudo[88699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:55 compute-0 python3.9[88702]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:55 compute-0 sudo[88699]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:55 compute-0 sudo[88825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xagxrselufuulrhzbnafhavikztacrsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528014.669002-289-139895174521149/AnsiballZ_copy.py'
Feb 19 19:06:55 compute-0 sudo[88825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:55 compute-0 python3.9[88828]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528014.669002-289-139895174521149/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:55 compute-0 sudo[88825]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:56 compute-0 sudo[88978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbbvmfqzysbmgbfxbvwbegpgvqikjaec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528016.0299938-319-100599213118065/AnsiballZ_stat.py'
Feb 19 19:06:56 compute-0 sudo[88978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:56 compute-0 python3.9[88981]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:56 compute-0 sudo[88978]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:56 compute-0 sudo[89104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-diumdjkkfmyadskgkhwlwcuowrksyvkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528016.0299938-319-100599213118065/AnsiballZ_copy.py'
Feb 19 19:06:56 compute-0 sudo[89104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:57 compute-0 python3.9[89107]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528016.0299938-319-100599213118065/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:57 compute-0 sudo[89104]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:57 compute-0 sudo[89257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkmlvjturwlmvhixutksfvriqwjffkqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528017.319804-349-273006856449586/AnsiballZ_stat.py'
Feb 19 19:06:57 compute-0 sudo[89257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:57 compute-0 python3.9[89260]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:57 compute-0 sudo[89257]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:58 compute-0 sudo[89383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lytvofhsduwtjutyeuqtivhdzowflmrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528017.319804-349-273006856449586/AnsiballZ_copy.py'
Feb 19 19:06:58 compute-0 sudo[89383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:58 compute-0 python3.9[89386]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528017.319804-349-273006856449586/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:58 compute-0 sudo[89383]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:58 compute-0 sudo[89536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhpknsuxpbegnufdbccsbndndenlraln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528018.6023047-379-257300750384182/AnsiballZ_stat.py'
Feb 19 19:06:58 compute-0 sudo[89536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:59 compute-0 python3.9[89539]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:06:59 compute-0 sudo[89536]: pam_unix(sudo:session): session closed for user root
Feb 19 19:06:59 compute-0 sudo[89662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuesfhakqhcdtexretismaddcnhpbctb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528018.6023047-379-257300750384182/AnsiballZ_copy.py'
Feb 19 19:06:59 compute-0 sudo[89662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:06:59 compute-0 python3.9[89665]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528018.6023047-379-257300750384182/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:06:59 compute-0 sudo[89662]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:00 compute-0 sudo[89815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eniravzznshiomqoqmqocpezlouoheoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528019.8474789-409-188483988247216/AnsiballZ_stat.py'
Feb 19 19:07:00 compute-0 sudo[89815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:00 compute-0 python3.9[89818]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:07:00 compute-0 sudo[89815]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:00 compute-0 sudo[89941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmfblxbylvwricdfmctnkvnwiraqbwma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528019.8474789-409-188483988247216/AnsiballZ_copy.py'
Feb 19 19:07:00 compute-0 sudo[89941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:00 compute-0 python3.9[89944]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528019.8474789-409-188483988247216/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:07:00 compute-0 sudo[89941]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:01 compute-0 sudo[90094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqqnbmimmadopvtruxlbjlucbgmvbyqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528021.020449-439-166892477170474/AnsiballZ_file.py'
Feb 19 19:07:01 compute-0 sudo[90094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:01 compute-0 python3.9[90097]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:07:01 compute-0 sudo[90094]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:01 compute-0 sudo[90247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pggrcumviybjjsagqxmsxtrenjjmgpnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528021.6292078-455-208396732733493/AnsiballZ_command.py'
Feb 19 19:07:01 compute-0 sudo[90247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:02 compute-0 python3.9[90250]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:07:02 compute-0 sudo[90247]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:02 compute-0 sudo[90403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvnwffwgliqbysdwqkfkihjtrfolswim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528022.2666628-471-2417106977402/AnsiballZ_blockinfile.py'
Feb 19 19:07:02 compute-0 sudo[90403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:02 compute-0 python3.9[90406]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:07:02 compute-0 sudo[90403]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:03 compute-0 sudo[90556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pltvfxvoyltskfvydjrqvgfiqkvhrfls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528023.152969-489-39024055701779/AnsiballZ_command.py'
Feb 19 19:07:03 compute-0 sudo[90556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:03 compute-0 python3.9[90559]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:07:03 compute-0 sudo[90556]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:03 compute-0 sudo[90710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwdrvynpxnuhronfgubczhsraueritpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528023.7639492-505-85862035687632/AnsiballZ_stat.py'
Feb 19 19:07:03 compute-0 sudo[90710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:04 compute-0 python3.9[90713]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:07:04 compute-0 sudo[90710]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:04 compute-0 sudo[90865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeaektqxfzswqibuyxrtjaknelxqczgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528024.3465536-521-13681976849268/AnsiballZ_command.py'
Feb 19 19:07:04 compute-0 sudo[90865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:04 compute-0 python3.9[90868]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:07:04 compute-0 sudo[90865]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:05 compute-0 sudo[91021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nccyuuyrdueikqdjdjouwmdmlcceteow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528024.9652274-537-6188302634060/AnsiballZ_file.py'
Feb 19 19:07:05 compute-0 sudo[91021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:05 compute-0 python3.9[91024]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:07:05 compute-0 sudo[91021]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:06 compute-0 python3.9[91174]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:07:07 compute-0 sudo[91325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xablcsqyxkdwtgvjuwihlwadhihwqcup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528027.3340864-619-281277290740081/AnsiballZ_command.py'
Feb 19 19:07:07 compute-0 sudo[91325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:07 compute-0 python3.9[91328]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:2f:db:26:37" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:07:07 compute-0 ovs-vsctl[91329]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:2f:db:26:37 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Feb 19 19:07:07 compute-0 sudo[91325]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:08 compute-0 sudo[91481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fftjcxtzzprnzejjouuxikvvubpqhgny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528027.967887-637-98489987430407/AnsiballZ_command.py'
Feb 19 19:07:08 compute-0 sudo[91481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:08 compute-0 sshd-session[91437]: Invalid user shreyas from 43.166.137.151 port 39960
Feb 19 19:07:08 compute-0 sshd-session[91437]: Received disconnect from 43.166.137.151 port 39960:11: Bye Bye [preauth]
Feb 19 19:07:08 compute-0 sshd-session[91437]: Disconnected from invalid user shreyas 43.166.137.151 port 39960 [preauth]
Feb 19 19:07:08 compute-0 python3.9[91484]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:07:08 compute-0 sudo[91481]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:09 compute-0 sudo[91637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ockwoeaajrtejzgxmiwfrtncgfhqakhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528028.7148447-653-147560245032147/AnsiballZ_command.py'
Feb 19 19:07:09 compute-0 sudo[91637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:09 compute-0 python3.9[91640]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:07:09 compute-0 ovs-vsctl[91641]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Feb 19 19:07:09 compute-0 sudo[91637]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:09 compute-0 python3.9[91791]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:07:10 compute-0 sudo[91943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zipcowfppxymfaxllnatjfvmchyqiteq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528030.2117333-687-231544958013282/AnsiballZ_file.py'
Feb 19 19:07:10 compute-0 sudo[91943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:10 compute-0 python3.9[91946]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:07:10 compute-0 sudo[91943]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:11 compute-0 sudo[92096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnhelpastfrnbwnkqftyiokzjsretmtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528031.0347958-703-227435286954618/AnsiballZ_stat.py'
Feb 19 19:07:11 compute-0 sudo[92096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:11 compute-0 python3.9[92099]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:07:11 compute-0 sudo[92096]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:11 compute-0 sudo[92175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksriiuszmjbpcufdwbwbodihqmerdxas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528031.0347958-703-227435286954618/AnsiballZ_file.py'
Feb 19 19:07:11 compute-0 sudo[92175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:11 compute-0 python3.9[92178]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:07:11 compute-0 sudo[92175]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:12 compute-0 sudo[92328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lextzbyhruxommkieqrlrqwgujaahnxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528031.9224336-703-42766554028577/AnsiballZ_stat.py'
Feb 19 19:07:12 compute-0 sudo[92328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:12 compute-0 python3.9[92331]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:07:12 compute-0 sudo[92328]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:12 compute-0 sudo[92407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsudsexipntdwkxgkqdjwzyahckxprzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528031.9224336-703-42766554028577/AnsiballZ_file.py'
Feb 19 19:07:12 compute-0 sudo[92407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:12 compute-0 python3.9[92410]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:07:12 compute-0 sudo[92407]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:13 compute-0 sudo[92560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fojeafvspjtbmgbpklrxihvhzwtvrusx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528033.2405128-749-223426373809994/AnsiballZ_file.py'
Feb 19 19:07:13 compute-0 sudo[92560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:13 compute-0 python3.9[92563]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:07:13 compute-0 sudo[92560]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:14 compute-0 sudo[92713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gorpyzresxdjlqhbudwuomfdavghjqwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528033.8890026-765-78790089684323/AnsiballZ_stat.py'
Feb 19 19:07:14 compute-0 sudo[92713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:14 compute-0 python3.9[92716]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:07:14 compute-0 sudo[92713]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:14 compute-0 sudo[92792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwyfqngqambwlrzxyedpytazblonurgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528033.8890026-765-78790089684323/AnsiballZ_file.py'
Feb 19 19:07:14 compute-0 sudo[92792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:14 compute-0 python3.9[92795]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:07:14 compute-0 sudo[92792]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:15 compute-0 sudo[92945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-petmxyqtdzfpwdaiecdkkqgoscvohhne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528034.9977734-789-237692726303035/AnsiballZ_stat.py'
Feb 19 19:07:15 compute-0 sudo[92945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:15 compute-0 python3.9[92948]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:07:15 compute-0 sudo[92945]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:15 compute-0 sudo[93024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byiikehienjjeqpkzudmimbzmjglrkqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528034.9977734-789-237692726303035/AnsiballZ_file.py'
Feb 19 19:07:15 compute-0 sudo[93024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:15 compute-0 python3.9[93027]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:07:15 compute-0 sudo[93024]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:16 compute-0 sudo[93177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfxbxwdcimrgxmjcbfbeuyaxmvjuppnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528036.1798556-813-233708891762636/AnsiballZ_systemd.py'
Feb 19 19:07:16 compute-0 sudo[93177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:16 compute-0 python3.9[93180]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:07:16 compute-0 systemd[1]: Reloading.
Feb 19 19:07:16 compute-0 systemd-rc-local-generator[93205]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:07:16 compute-0 systemd-sysv-generator[93212]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:07:17 compute-0 sudo[93177]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:17 compute-0 sudo[93374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhxcwhstotcmunyybbcvivgeqwwynxht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528037.2768416-829-19549402056509/AnsiballZ_stat.py'
Feb 19 19:07:17 compute-0 sudo[93374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:17 compute-0 python3.9[93377]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:07:17 compute-0 sudo[93374]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:17 compute-0 sudo[93453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxcfsdqkfqjyrnuqlnxfklbewzrwyeie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528037.2768416-829-19549402056509/AnsiballZ_file.py'
Feb 19 19:07:17 compute-0 sudo[93453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:18 compute-0 python3.9[93456]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:07:18 compute-0 sudo[93453]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:18 compute-0 sudo[93606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-laexpgbiqcvyhwfhedryqokvtiyzjcvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528038.4016469-853-237417459380948/AnsiballZ_stat.py'
Feb 19 19:07:18 compute-0 sudo[93606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:18 compute-0 python3.9[93609]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:07:18 compute-0 sudo[93606]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:19 compute-0 sudo[93685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzxigrsbxnchhodlngadffscilavtkyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528038.4016469-853-237417459380948/AnsiballZ_file.py'
Feb 19 19:07:19 compute-0 sudo[93685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:19 compute-0 python3.9[93688]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:07:19 compute-0 sudo[93685]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:19 compute-0 sudo[93838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccpgxahxlydujkfhvchmgmpqkdlwqxcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528039.5536866-877-71282566433566/AnsiballZ_systemd.py'
Feb 19 19:07:19 compute-0 sudo[93838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:20 compute-0 python3.9[93841]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:07:20 compute-0 systemd[1]: Reloading.
Feb 19 19:07:20 compute-0 systemd-rc-local-generator[93869]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:07:20 compute-0 systemd-sysv-generator[93872]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:07:20 compute-0 systemd[1]: Starting Create netns directory...
Feb 19 19:07:20 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 19 19:07:20 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 19 19:07:20 compute-0 systemd[1]: Finished Create netns directory.
Feb 19 19:07:20 compute-0 sudo[93838]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:20 compute-0 sudo[94040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzugqjbzqgxdbpqxmgfejtjuipxrpgdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528040.7140152-897-176824272611546/AnsiballZ_file.py'
Feb 19 19:07:20 compute-0 sudo[94040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:21 compute-0 python3.9[94043]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:07:21 compute-0 sudo[94040]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:21 compute-0 sudo[94193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpjoeqzdbhrijssekhcbxejhewygyuyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528041.3782287-913-20690341742838/AnsiballZ_stat.py'
Feb 19 19:07:21 compute-0 sudo[94193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:21 compute-0 python3.9[94196]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:07:21 compute-0 sudo[94193]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:22 compute-0 sudo[94317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuvfbmhjogxadqoqtlbtkpploxbjhjxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528041.3782287-913-20690341742838/AnsiballZ_copy.py'
Feb 19 19:07:22 compute-0 sudo[94317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:22 compute-0 python3.9[94320]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771528041.3782287-913-20690341742838/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:07:22 compute-0 sudo[94317]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:22 compute-0 sudo[94470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdrvgrehuexennrewdwhvdvennwwtppu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528042.6594777-947-257430982034465/AnsiballZ_file.py'
Feb 19 19:07:22 compute-0 sudo[94470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:23 compute-0 python3.9[94473]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:07:23 compute-0 sudo[94470]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:23 compute-0 sudo[94623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kusomealchunbhptwgyjltqvkgsnieym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528043.388451-963-147796334166886/AnsiballZ_file.py'
Feb 19 19:07:23 compute-0 sudo[94623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:23 compute-0 python3.9[94626]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:07:23 compute-0 sudo[94623]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:24 compute-0 sudo[94776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ditnbdbbehrgqjjffyaxuumhrrcdfzoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528044.1048174-979-270510627266836/AnsiballZ_stat.py'
Feb 19 19:07:24 compute-0 sudo[94776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:24 compute-0 python3.9[94779]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:07:24 compute-0 sudo[94776]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:24 compute-0 sudo[94900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xascjyarfjsbjutquheswizesfnqrcvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528044.1048174-979-270510627266836/AnsiballZ_copy.py'
Feb 19 19:07:24 compute-0 sudo[94900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:25 compute-0 python3.9[94903]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528044.1048174-979-270510627266836/.source.json _original_basename=.g1vuljkq follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:07:25 compute-0 sudo[94900]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:25 compute-0 python3.9[95053]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:07:28 compute-0 sudo[95474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avkgcukebypxztijwxfoydggcfdxnlxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528047.613747-1059-58405602812027/AnsiballZ_container_config_data.py'
Feb 19 19:07:28 compute-0 sudo[95474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:28 compute-0 python3.9[95477]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Feb 19 19:07:28 compute-0 sudo[95474]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:28 compute-0 sudo[95627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmvqakveorbeagazhxseoiqeyveajxwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528048.6244283-1081-147000244876197/AnsiballZ_container_config_hash.py'
Feb 19 19:07:28 compute-0 sudo[95627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:29 compute-0 python3.9[95630]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 19 19:07:29 compute-0 sudo[95627]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:29 compute-0 sudo[95780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whdleafdudtdvafnukgyzsrtyelnewli ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771528049.494785-1101-205378668940030/AnsiballZ_edpm_container_manage.py'
Feb 19 19:07:29 compute-0 sudo[95780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:30 compute-0 python3[95783]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Feb 19 19:07:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:07:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:07:30 compute-0 podman[95818]: 2026-02-19 19:07:30.260522909 +0000 UTC m=+0.041997928 container create 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 19 19:07:30 compute-0 podman[95818]: 2026-02-19 19:07:30.237265361 +0000 UTC m=+0.018740410 image pull 5a85504e92eb83605c2458586ce4ee1b65cbcc1df72633eee3ac25690daabc9a 38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Feb 19 19:07:30 compute-0 python3[95783]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z 38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Feb 19 19:07:30 compute-0 sudo[95780]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:30 compute-0 sudo[96007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbnhkklpcuqqhnydzedmljeerawnhthd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528050.7156842-1117-49765775459679/AnsiballZ_stat.py'
Feb 19 19:07:30 compute-0 sudo[96007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:31 compute-0 python3.9[96010]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:07:31 compute-0 sudo[96007]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 19 19:07:31 compute-0 sudo[96162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyaehwadgggnvqbrlckehmfsysswpffp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528051.4943733-1135-113024402686627/AnsiballZ_file.py'
Feb 19 19:07:31 compute-0 sudo[96162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:31 compute-0 python3.9[96165]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:07:31 compute-0 sudo[96162]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:32 compute-0 sudo[96239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzgvptllitnuborftbucvtdwhgdchdhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528051.4943733-1135-113024402686627/AnsiballZ_stat.py'
Feb 19 19:07:32 compute-0 sudo[96239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:32 compute-0 python3.9[96242]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:07:32 compute-0 sudo[96239]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:32 compute-0 sudo[96391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrjqquldfhiyhueleumklipapjqgjwgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528052.2498007-1135-263956945247343/AnsiballZ_copy.py'
Feb 19 19:07:32 compute-0 sudo[96391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:32 compute-0 python3.9[96394]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771528052.2498007-1135-263956945247343/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:07:32 compute-0 sudo[96391]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:32 compute-0 sudo[96468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noljktrhanyhvcspjkkadrkjprrqdhwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528052.2498007-1135-263956945247343/AnsiballZ_systemd.py'
Feb 19 19:07:32 compute-0 sudo[96468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:33 compute-0 python3.9[96471]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 19 19:07:33 compute-0 systemd[1]: Reloading.
Feb 19 19:07:33 compute-0 systemd-sysv-generator[96499]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:07:33 compute-0 systemd-rc-local-generator[96493]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:07:33 compute-0 sudo[96468]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:33 compute-0 sudo[96587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vflzbqlrtbhlrgampnundjlgongrnvdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528052.2498007-1135-263956945247343/AnsiballZ_systemd.py'
Feb 19 19:07:33 compute-0 sudo[96587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:34 compute-0 python3.9[96590]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:07:34 compute-0 systemd[1]: Reloading.
Feb 19 19:07:34 compute-0 systemd-sysv-generator[96617]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:07:34 compute-0 systemd-rc-local-generator[96614]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:07:34 compute-0 systemd[1]: Starting ovn_controller container...
Feb 19 19:07:34 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Feb 19 19:07:34 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:07:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4aae318bc6d198435be0246727dc3d0d9151794cbabb6493129d51414d6a73a6/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 19 19:07:34 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e.
Feb 19 19:07:34 compute-0 podman[96638]: 2026-02-19 19:07:34.373441976 +0000 UTC m=+0.097012836 container init 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.build-date=20260216)
Feb 19 19:07:34 compute-0 ovn_controller[96653]: + sudo -E kolla_set_configs
Feb 19 19:07:34 compute-0 podman[96638]: 2026-02-19 19:07:34.393815197 +0000 UTC m=+0.117386037 container start 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 19 19:07:34 compute-0 edpm-start-podman-container[96638]: ovn_controller
Feb 19 19:07:34 compute-0 systemd[1]: Created slice User Slice of UID 0.
Feb 19 19:07:34 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 19 19:07:34 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 19 19:07:34 compute-0 systemd[1]: Starting User Manager for UID 0...
Feb 19 19:07:34 compute-0 edpm-start-podman-container[96637]: Creating additional drop-in dependency for "ovn_controller" (57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e)
Feb 19 19:07:34 compute-0 systemd[96683]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Feb 19 19:07:34 compute-0 podman[96660]: 2026-02-19 19:07:34.44559743 +0000 UTC m=+0.044619699 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 19 19:07:34 compute-0 systemd[1]: 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e-6ab6949cebfb6c39.service: Main process exited, code=exited, status=1/FAILURE
Feb 19 19:07:34 compute-0 systemd[1]: 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e-6ab6949cebfb6c39.service: Failed with result 'exit-code'.
Feb 19 19:07:34 compute-0 systemd[1]: Reloading.
Feb 19 19:07:34 compute-0 systemd-rc-local-generator[96743]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:07:34 compute-0 systemd-sysv-generator[96746]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:07:34 compute-0 systemd[96683]: Queued start job for default target Main User Target.
Feb 19 19:07:34 compute-0 systemd[96683]: Created slice User Application Slice.
Feb 19 19:07:34 compute-0 systemd[96683]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 19 19:07:34 compute-0 systemd[96683]: Started Daily Cleanup of User's Temporary Directories.
Feb 19 19:07:34 compute-0 systemd[96683]: Reached target Paths.
Feb 19 19:07:34 compute-0 systemd[96683]: Reached target Timers.
Feb 19 19:07:34 compute-0 systemd[96683]: Starting D-Bus User Message Bus Socket...
Feb 19 19:07:34 compute-0 systemd[96683]: Starting Create User's Volatile Files and Directories...
Feb 19 19:07:34 compute-0 systemd[96683]: Finished Create User's Volatile Files and Directories.
Feb 19 19:07:34 compute-0 systemd[96683]: Listening on D-Bus User Message Bus Socket.
Feb 19 19:07:34 compute-0 systemd[96683]: Reached target Sockets.
Feb 19 19:07:34 compute-0 systemd[96683]: Reached target Basic System.
Feb 19 19:07:34 compute-0 systemd[96683]: Reached target Main User Target.
Feb 19 19:07:34 compute-0 systemd[96683]: Startup finished in 145ms.
Feb 19 19:07:34 compute-0 systemd[1]: Started User Manager for UID 0.
Feb 19 19:07:34 compute-0 systemd[1]: Started ovn_controller container.
Feb 19 19:07:34 compute-0 systemd[1]: Started Session c1 of User root.
Feb 19 19:07:34 compute-0 sudo[96587]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:34 compute-0 ovn_controller[96653]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 19 19:07:34 compute-0 ovn_controller[96653]: INFO:__main__:Validating config file
Feb 19 19:07:34 compute-0 ovn_controller[96653]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 19 19:07:34 compute-0 ovn_controller[96653]: INFO:__main__:Writing out command to execute
Feb 19 19:07:34 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Feb 19 19:07:34 compute-0 ovn_controller[96653]: ++ cat /run_command
Feb 19 19:07:34 compute-0 ovn_controller[96653]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Feb 19 19:07:34 compute-0 ovn_controller[96653]: + ARGS=
Feb 19 19:07:34 compute-0 ovn_controller[96653]: + sudo kolla_copy_cacerts
Feb 19 19:07:34 compute-0 systemd[1]: Started Session c2 of User root.
Feb 19 19:07:34 compute-0 ovn_controller[96653]: + [[ ! -n '' ]]
Feb 19 19:07:34 compute-0 ovn_controller[96653]: + . kolla_extend_start
Feb 19 19:07:34 compute-0 ovn_controller[96653]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Feb 19 19:07:34 compute-0 ovn_controller[96653]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Feb 19 19:07:34 compute-0 ovn_controller[96653]: + umask 0022
Feb 19 19:07:34 compute-0 ovn_controller[96653]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Feb 19 19:07:34 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Feb 19 19:07:34 compute-0 ovn_controller[96653]: 2026-02-19T19:07:34Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 19 19:07:34 compute-0 ovn_controller[96653]: 2026-02-19T19:07:34Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 19 19:07:34 compute-0 ovn_controller[96653]: 2026-02-19T19:07:34Z|00003|main|INFO|OVN internal version is : [24.09.4-20.37.0-77.8]
Feb 19 19:07:34 compute-0 ovn_controller[96653]: 2026-02-19T19:07:34Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Feb 19 19:07:34 compute-0 ovn_controller[96653]: 2026-02-19T19:07:34Z|00005|stream_ssl|ERR|ssl:ovsdbserver-sb.openstack.svc:6642: connect: Address family not supported by protocol
Feb 19 19:07:34 compute-0 ovn_controller[96653]: 2026-02-19T19:07:34Z|00006|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 19 19:07:34 compute-0 ovn_controller[96653]: 2026-02-19T19:07:34Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Address family not supported by protocol)
Feb 19 19:07:34 compute-0 ovn_controller[96653]: 2026-02-19T19:07:34Z|00008|main|INFO|OVNSB IDL reconnected, force recompute.
Feb 19 19:07:34 compute-0 ovn_controller[96653]: 2026-02-19T19:07:34Z|00009|ovn_util|INFO|statctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Feb 19 19:07:34 compute-0 ovn_controller[96653]: 2026-02-19T19:07:34Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 19 19:07:34 compute-0 ovn_controller[96653]: 2026-02-19T19:07:34Z|00011|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Feb 19 19:07:34 compute-0 ovn_controller[96653]: 2026-02-19T19:07:34Z|00012|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Feb 19 19:07:34 compute-0 ovn_controller[96653]: 2026-02-19T19:07:34Z|00013|ovn_util|INFO|pinctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Feb 19 19:07:34 compute-0 ovn_controller[96653]: 2026-02-19T19:07:34Z|00014|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 19 19:07:34 compute-0 ovn_controller[96653]: 2026-02-19T19:07:34Z|00015|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Feb 19 19:07:34 compute-0 ovn_controller[96653]: 2026-02-19T19:07:34Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Feb 19 19:07:34 compute-0 NetworkManager[56519]: <info>  [1771528054.8139] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Feb 19 19:07:34 compute-0 NetworkManager[56519]: <info>  [1771528054.8147] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 19 19:07:34 compute-0 NetworkManager[56519]: <warn>  [1771528054.8149] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 19 19:07:34 compute-0 NetworkManager[56519]: <info>  [1771528054.8157] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Feb 19 19:07:34 compute-0 NetworkManager[56519]: <info>  [1771528054.8163] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Feb 19 19:07:34 compute-0 NetworkManager[56519]: <info>  [1771528054.8166] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Feb 19 19:07:34 compute-0 kernel: br-int: entered promiscuous mode
Feb 19 19:07:34 compute-0 systemd-udevd[96795]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00001|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00001|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00017|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00018|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00019|ovn_util|INFO|features: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00021|features|INFO|OVS Feature: ct_zero_snat, state: supported
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00022|features|INFO|OVS Feature: ct_flush, state: supported
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00023|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00024|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00025|main|INFO|OVS feature set changed, force recompute.
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00026|ovn_util|INFO|ofctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00027|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00028|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00029|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00030|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00031|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00032|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00033|features|INFO|OVS Feature: meter_support, state: supported
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00034|features|INFO|OVS Feature: group_support, state: supported
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00035|main|INFO|OVS feature set changed, force recompute.
Feb 19 19:07:35 compute-0 NetworkManager[56519]: <info>  [1771528055.8501] manager: (ovn-ad4c1d-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00036|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Feb 19 19:07:35 compute-0 ovn_controller[96653]: 2026-02-19T19:07:35Z|00037|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Feb 19 19:07:35 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Feb 19 19:07:35 compute-0 systemd-udevd[96797]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:07:35 compute-0 NetworkManager[56519]: <info>  [1771528055.8639] device (genev_sys_6081): carrier: link connected
Feb 19 19:07:35 compute-0 NetworkManager[56519]: <info>  [1771528055.8643] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Feb 19 19:07:36 compute-0 python3.9[96926]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 19 19:07:36 compute-0 NetworkManager[56519]: <info>  [1771528056.5551] manager: (ovn-d84819-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Feb 19 19:07:36 compute-0 sudo[97076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygblprmqgisaegclpiejxprxveqnjkdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528056.6114068-1225-34628177304447/AnsiballZ_stat.py'
Feb 19 19:07:36 compute-0 sudo[97076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:36 compute-0 python3.9[97079]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:07:36 compute-0 sudo[97076]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:37 compute-0 sudo[97200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylyjtqguidyjeqocajlldeopwmlupzbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528056.6114068-1225-34628177304447/AnsiballZ_copy.py'
Feb 19 19:07:37 compute-0 sudo[97200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:37 compute-0 python3.9[97203]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528056.6114068-1225-34628177304447/.source.yaml _original_basename=.nru0wq2a follow=False checksum=f7e6682b13861c1cd7babf3c58bd5ecb2f4586a6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:07:37 compute-0 sudo[97200]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:38 compute-0 sudo[97353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otfzgjlttpewvpkqexprrpgwinalwlay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528057.8659883-1255-162798082141890/AnsiballZ_command.py'
Feb 19 19:07:38 compute-0 sudo[97353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:38 compute-0 python3.9[97356]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:07:38 compute-0 ovs-vsctl[97357]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Feb 19 19:07:38 compute-0 sudo[97353]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:38 compute-0 sudo[97507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpyygpwkoopfroslkmqvisjsmmncotsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528058.6123223-1271-154744890707619/AnsiballZ_command.py'
Feb 19 19:07:38 compute-0 sudo[97507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:39 compute-0 python3.9[97510]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:07:39 compute-0 ovs-vsctl[97512]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Feb 19 19:07:39 compute-0 sudo[97507]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:39 compute-0 sudo[97663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urncjwlubgitxdeyvvlxzujlggcmrafe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528059.6144729-1299-9750462003250/AnsiballZ_command.py'
Feb 19 19:07:39 compute-0 sudo[97663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:40 compute-0 python3.9[97666]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:07:40 compute-0 ovs-vsctl[97667]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Feb 19 19:07:40 compute-0 sudo[97663]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:40 compute-0 sshd-session[86076]: Connection closed by 192.168.122.30 port 37324
Feb 19 19:07:40 compute-0 sshd-session[86073]: pam_unix(sshd:session): session closed for user zuul
Feb 19 19:07:40 compute-0 systemd[1]: session-20.scope: Deactivated successfully.
Feb 19 19:07:40 compute-0 systemd[1]: session-20.scope: Consumed 39.241s CPU time.
Feb 19 19:07:40 compute-0 systemd-logind[822]: Session 20 logged out. Waiting for processes to exit.
Feb 19 19:07:40 compute-0 systemd-logind[822]: Removed session 20.
Feb 19 19:07:44 compute-0 systemd[1]: Stopping User Manager for UID 0...
Feb 19 19:07:44 compute-0 systemd[96683]: Activating special unit Exit the Session...
Feb 19 19:07:44 compute-0 systemd[96683]: Stopped target Main User Target.
Feb 19 19:07:44 compute-0 systemd[96683]: Stopped target Basic System.
Feb 19 19:07:44 compute-0 systemd[96683]: Stopped target Paths.
Feb 19 19:07:44 compute-0 systemd[96683]: Stopped target Sockets.
Feb 19 19:07:44 compute-0 systemd[96683]: Stopped target Timers.
Feb 19 19:07:44 compute-0 systemd[96683]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 19 19:07:44 compute-0 systemd[96683]: Closed D-Bus User Message Bus Socket.
Feb 19 19:07:44 compute-0 systemd[96683]: Stopped Create User's Volatile Files and Directories.
Feb 19 19:07:44 compute-0 systemd[96683]: Removed slice User Application Slice.
Feb 19 19:07:44 compute-0 systemd[96683]: Reached target Shutdown.
Feb 19 19:07:44 compute-0 systemd[96683]: Finished Exit the Session.
Feb 19 19:07:44 compute-0 systemd[96683]: Reached target Exit the Session.
Feb 19 19:07:44 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Feb 19 19:07:44 compute-0 systemd[1]: Stopped User Manager for UID 0.
Feb 19 19:07:44 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 19 19:07:44 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 19 19:07:44 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 19 19:07:44 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 19 19:07:44 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Feb 19 19:07:45 compute-0 sshd-session[97695]: Accepted publickey for zuul from 192.168.122.30 port 50716 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 19:07:45 compute-0 systemd-logind[822]: New session 22 of user zuul.
Feb 19 19:07:45 compute-0 systemd[1]: Started Session 22 of User zuul.
Feb 19 19:07:45 compute-0 sshd-session[97695]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 19:07:46 compute-0 python3.9[97848]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:07:47 compute-0 sudo[98002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovrxllxjarnyfkfcuegztvuxipwctfcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528067.62043-43-117824144778249/AnsiballZ_file.py'
Feb 19 19:07:47 compute-0 sudo[98002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:48 compute-0 python3.9[98005]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:07:48 compute-0 sudo[98002]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:48 compute-0 sudo[98155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tercyzmfbzqsadecateltwddfnfimuya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528068.2548718-43-7710677568928/AnsiballZ_file.py'
Feb 19 19:07:48 compute-0 sudo[98155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:48 compute-0 python3.9[98158]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:07:48 compute-0 sudo[98155]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:48 compute-0 sudo[98308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-angrxfhohlqzwqhziehxfjuzycutxqxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528068.7481554-43-124068092316590/AnsiballZ_file.py'
Feb 19 19:07:48 compute-0 sudo[98308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:49 compute-0 python3.9[98311]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:07:49 compute-0 sudo[98308]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:49 compute-0 sudo[98461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtbabclnhubjitnwioadvzmilpwwwcsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528069.247999-43-148234188752869/AnsiballZ_file.py'
Feb 19 19:07:49 compute-0 sudo[98461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:49 compute-0 python3.9[98464]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:07:49 compute-0 sudo[98461]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:50 compute-0 sudo[98614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yblbzvokzeptrbxfkibvtvslpdunplld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528069.8323822-43-82697205022550/AnsiballZ_file.py'
Feb 19 19:07:50 compute-0 sudo[98614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:50 compute-0 python3.9[98617]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:07:50 compute-0 sudo[98614]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:51 compute-0 python3.9[98767]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:07:52 compute-0 sudo[98917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfoplzmqkuomhhbpdpycughxklqtvlzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528071.8436394-131-268417572210743/AnsiballZ_seboolean.py'
Feb 19 19:07:52 compute-0 sudo[98917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:52 compute-0 python3.9[98920]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 19 19:07:52 compute-0 sudo[98917]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:53 compute-0 python3.9[99070]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:07:54 compute-0 python3.9[99192]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771528073.2005715-147-127149694007706/.source follow=False _original_basename=haproxy.j2 checksum=0593c93f9b31ffdf600eafeda3c342ef9b393062 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:07:54 compute-0 python3.9[99342]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:07:56 compute-0 python3.9[99463]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771528074.533444-177-56519789795583/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:07:56 compute-0 sudo[99613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-estisngztljfyyuqcvqhryfxumsvnqkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528076.369232-211-269988966646967/AnsiballZ_setup.py'
Feb 19 19:07:56 compute-0 sudo[99613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:56 compute-0 python3.9[99616]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 19 19:07:57 compute-0 sudo[99613]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:57 compute-0 sudo[99698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmpptlqcqtfnvlzdzmcmtkaggjokfenj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528076.369232-211-269988966646967/AnsiballZ_dnf.py'
Feb 19 19:07:57 compute-0 sudo[99698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:57 compute-0 python3.9[99701]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 19 19:07:59 compute-0 sudo[99698]: pam_unix(sudo:session): session closed for user root
Feb 19 19:07:59 compute-0 sudo[99852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uchdmgsncmbfbpnrloxkrzsxagabckvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528079.1790764-235-8228837012603/AnsiballZ_systemd.py'
Feb 19 19:07:59 compute-0 sudo[99852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:07:59 compute-0 python3.9[99855]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 19 19:07:59 compute-0 sudo[99852]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:00 compute-0 python3.9[100008]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:08:01 compute-0 python3.9[100129]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771528080.302065-251-167533304433646/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:08:01 compute-0 python3.9[100279]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:08:02 compute-0 python3.9[100400]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771528081.4047515-251-219251416442527/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:08:03 compute-0 python3.9[100550]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:08:03 compute-0 python3.9[100671]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771528083.1657872-339-202208998489578/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:08:04 compute-0 python3.9[100821]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:08:04 compute-0 ovn_controller[96653]: 2026-02-19T19:08:04Z|00038|memory|INFO|17492 kB peak resident set size after 30.2 seconds
Feb 19 19:08:04 compute-0 ovn_controller[96653]: 2026-02-19T19:08:04Z|00039|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Feb 19 19:08:05 compute-0 podman[100916]: 2026-02-19 19:08:05.014917539 +0000 UTC m=+0.075831117 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:08:05 compute-0 python3.9[100955]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771528084.1370826-339-72133959030955/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:08:06 compute-0 python3.9[101118]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:08:06 compute-0 sudo[101270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpugmjmclbluuszybeuxfoulsuhuembe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528086.3568866-415-105625409702282/AnsiballZ_file.py'
Feb 19 19:08:06 compute-0 sudo[101270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:06 compute-0 python3.9[101273]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:08:06 compute-0 sudo[101270]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:07 compute-0 sudo[101423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzwrakngsagmlvgwhvminfjiotgfqefi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528087.0373218-431-198342258795802/AnsiballZ_stat.py'
Feb 19 19:08:07 compute-0 sudo[101423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:07 compute-0 python3.9[101426]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:08:07 compute-0 sudo[101423]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:07 compute-0 sudo[101502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smahotpywqbfaxdittmysxirinsdjpsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528087.0373218-431-198342258795802/AnsiballZ_file.py'
Feb 19 19:08:07 compute-0 sudo[101502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:07 compute-0 python3.9[101505]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:08:07 compute-0 sudo[101502]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:08 compute-0 sudo[101655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxqpyyiyksmssrprdiudhlvzgjazaaff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528087.9465697-431-110988736716245/AnsiballZ_stat.py'
Feb 19 19:08:08 compute-0 sudo[101655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:08 compute-0 python3.9[101658]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:08:08 compute-0 sudo[101655]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:08 compute-0 sudo[101734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aukaniqideaeldtwwmzrwvbzskydpvno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528087.9465697-431-110988736716245/AnsiballZ_file.py'
Feb 19 19:08:08 compute-0 sudo[101734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:08 compute-0 python3.9[101737]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:08:08 compute-0 sudo[101734]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:09 compute-0 sudo[101887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzxludkcsdmulhearrocftxsywsaufzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528089.087331-477-43904541871392/AnsiballZ_file.py'
Feb 19 19:08:09 compute-0 sudo[101887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:09 compute-0 python3.9[101890]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:09 compute-0 sudo[101887]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:10 compute-0 sudo[102040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpcuwdtdhdaxeqhkzwighcbqqaunobwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528089.9035833-493-241937119579550/AnsiballZ_stat.py'
Feb 19 19:08:10 compute-0 sudo[102040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:10 compute-0 python3.9[102043]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:08:10 compute-0 sudo[102040]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:10 compute-0 sudo[102119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbcoxgxfxdcpojzpkqnsoobzllruxpzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528089.9035833-493-241937119579550/AnsiballZ_file.py'
Feb 19 19:08:10 compute-0 sudo[102119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:10 compute-0 python3.9[102122]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:10 compute-0 sudo[102119]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:11 compute-0 sudo[102272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxreofufteidrlmkuaymvtqnlszfydze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528091.2246325-517-44911832397586/AnsiballZ_stat.py'
Feb 19 19:08:11 compute-0 sudo[102272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:11 compute-0 python3.9[102275]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:08:11 compute-0 sudo[102272]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:11 compute-0 sudo[102351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afgetnzikwldhrbuwwxjtudghecmadyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528091.2246325-517-44911832397586/AnsiballZ_file.py'
Feb 19 19:08:11 compute-0 sudo[102351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:12 compute-0 python3.9[102354]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:12 compute-0 sudo[102351]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:12 compute-0 sudo[102504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgkesnazlshrakankdbvhufxwysmevwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528092.2961257-541-137050371370280/AnsiballZ_systemd.py'
Feb 19 19:08:12 compute-0 sudo[102504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:12 compute-0 python3.9[102507]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:08:12 compute-0 systemd[1]: Reloading.
Feb 19 19:08:12 compute-0 systemd-rc-local-generator[102530]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:08:12 compute-0 systemd-sysv-generator[102533]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:08:12 compute-0 sudo[102504]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:13 compute-0 sudo[102701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxggpfinqjspxlliyfumriasxsshlonb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528093.4244177-557-72234374332399/AnsiballZ_stat.py'
Feb 19 19:08:13 compute-0 sudo[102701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:13 compute-0 python3.9[102704]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:08:13 compute-0 sudo[102701]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:14 compute-0 sudo[102780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frsebqfgosvanwhitfcexfqdrkefvopw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528093.4244177-557-72234374332399/AnsiballZ_file.py'
Feb 19 19:08:14 compute-0 sudo[102780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:14 compute-0 python3.9[102783]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:14 compute-0 sudo[102780]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:14 compute-0 sudo[102935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckpaokmthgvpvsgzjexbrztoihsrzrjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528094.5357273-581-196327381891184/AnsiballZ_stat.py'
Feb 19 19:08:14 compute-0 sudo[102935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:14 compute-0 python3.9[102938]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:08:14 compute-0 sudo[102935]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:15 compute-0 sudo[103014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcxrprtjaarostvbioyqadghejeongui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528094.5357273-581-196327381891184/AnsiballZ_file.py'
Feb 19 19:08:15 compute-0 sudo[103014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:15 compute-0 python3.9[103017]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:15 compute-0 sudo[103014]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:15 compute-0 sudo[103167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dijffvpuiqribhyruuhkhcrzpeguuoxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528095.6089575-605-171016671287272/AnsiballZ_systemd.py'
Feb 19 19:08:15 compute-0 sudo[103167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:16 compute-0 sshd-session[102808]: Invalid user systemd from 182.75.216.74 port 57520
Feb 19 19:08:16 compute-0 python3.9[103170]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:08:16 compute-0 systemd[1]: Reloading.
Feb 19 19:08:16 compute-0 systemd-rc-local-generator[103193]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:08:16 compute-0 systemd-sysv-generator[103197]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:08:16 compute-0 sshd-session[102808]: Received disconnect from 182.75.216.74 port 57520:11: Bye Bye [preauth]
Feb 19 19:08:16 compute-0 sshd-session[102808]: Disconnected from invalid user systemd 182.75.216.74 port 57520 [preauth]
Feb 19 19:08:16 compute-0 systemd[1]: Starting Create netns directory...
Feb 19 19:08:16 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 19 19:08:16 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 19 19:08:16 compute-0 systemd[1]: Finished Create netns directory.
Feb 19 19:08:16 compute-0 sudo[103167]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:17 compute-0 sudo[103367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqtolebghpfeqmwjxqcilfdzddlkcvle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528096.9384787-625-5198875888138/AnsiballZ_file.py'
Feb 19 19:08:17 compute-0 sudo[103367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:17 compute-0 python3.9[103370]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:08:17 compute-0 sudo[103367]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:17 compute-0 sudo[103520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bloxczgoulxzsjjxtrucsbpssmpuqglj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528097.6184142-641-12089016648128/AnsiballZ_stat.py'
Feb 19 19:08:17 compute-0 sudo[103520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:18 compute-0 python3.9[103523]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:08:18 compute-0 sudo[103520]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:18 compute-0 sudo[103644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbqztdwzacsdncwisnonwmineugtunnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528097.6184142-641-12089016648128/AnsiballZ_copy.py'
Feb 19 19:08:18 compute-0 sudo[103644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:18 compute-0 python3.9[103647]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771528097.6184142-641-12089016648128/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:08:18 compute-0 sudo[103644]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:19 compute-0 sudo[103797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iadxcgpqqrcenaiwgphebwuljthfgjbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528099.044481-675-173587458923803/AnsiballZ_file.py'
Feb 19 19:08:19 compute-0 sudo[103797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:19 compute-0 python3.9[103800]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:19 compute-0 sudo[103797]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:19 compute-0 sudo[103950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbgywpncnqpmfuphkuonahahwnntotsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528099.682142-691-85375769368403/AnsiballZ_file.py'
Feb 19 19:08:19 compute-0 sudo[103950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:20 compute-0 python3.9[103953]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:08:20 compute-0 sudo[103950]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:20 compute-0 sudo[104103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxjlehxzkktinkenaxhxkeeylhqqixlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528100.406901-707-132133859237709/AnsiballZ_stat.py'
Feb 19 19:08:20 compute-0 sudo[104103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:20 compute-0 python3.9[104106]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:08:20 compute-0 sudo[104103]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:20 compute-0 sudo[104227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nydrbyslnqdvwmpumdcrtzytmchtfdmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528100.406901-707-132133859237709/AnsiballZ_copy.py'
Feb 19 19:08:20 compute-0 sudo[104227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:21 compute-0 python3.9[104230]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528100.406901-707-132133859237709/.source.json _original_basename=.l82rh048 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:21 compute-0 sudo[104227]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:21 compute-0 python3.9[104380]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:23 compute-0 sudo[104801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbmzdscsmdwsyyqhvzljqdsthmatzgxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528103.4926393-787-88771209140255/AnsiballZ_container_config_data.py'
Feb 19 19:08:23 compute-0 sudo[104801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:24 compute-0 python3.9[104804]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Feb 19 19:08:24 compute-0 sudo[104801]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:24 compute-0 sudo[104954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzxbsqskemvddzgfruttrtaobrmufuyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528104.3674285-809-952302006655/AnsiballZ_container_config_hash.py'
Feb 19 19:08:24 compute-0 sudo[104954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:24 compute-0 python3.9[104957]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 19 19:08:24 compute-0 sudo[104954]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:25 compute-0 sudo[105107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuahwmreilljtjiwaqaltwlxohcngfij ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771528105.4734182-829-244232918112374/AnsiballZ_edpm_container_manage.py'
Feb 19 19:08:25 compute-0 sudo[105107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:26 compute-0 python3[105110]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 19 19:08:26 compute-0 podman[105145]: 2026-02-19 19:08:26.229690399 +0000 UTC m=+0.040989626 container create 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.43.0)
Feb 19 19:08:26 compute-0 podman[105145]: 2026-02-19 19:08:26.20676063 +0000 UTC m=+0.018059907 image pull dfc4ee2c621ea595c16a7cd7a8233293e80df6b99346b1ce1a7ed22a35aace27 38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Feb 19 19:08:26 compute-0 python3[105110]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z 38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Feb 19 19:08:26 compute-0 sudo[105107]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:26 compute-0 sudo[105333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzokocctkcvhosfhnsjoexhgbschknkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528106.6552827-845-265042421083043/AnsiballZ_stat.py'
Feb 19 19:08:26 compute-0 sudo[105333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:27 compute-0 python3.9[105336]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:08:27 compute-0 sudo[105333]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:27 compute-0 sudo[105488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yofhyuwenovzhychgvukbdhojsexcxkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528107.3799157-863-86569606655406/AnsiballZ_file.py'
Feb 19 19:08:27 compute-0 sudo[105488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:27 compute-0 python3.9[105491]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:27 compute-0 sudo[105488]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:27 compute-0 sudo[105565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmjiysblvvfpfqmajjujcggcpoeihjgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528107.3799157-863-86569606655406/AnsiballZ_stat.py'
Feb 19 19:08:27 compute-0 sudo[105565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:28 compute-0 python3.9[105568]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:08:28 compute-0 sudo[105565]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:28 compute-0 sudo[105717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqimsmewiadskvhvqcyxdthuubystsda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528108.1986008-863-130143495769065/AnsiballZ_copy.py'
Feb 19 19:08:28 compute-0 sudo[105717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:28 compute-0 python3.9[105720]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771528108.1986008-863-130143495769065/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:28 compute-0 sudo[105717]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:29 compute-0 sudo[105794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oawerghcfpnwputittozvhygxwduguqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528108.1986008-863-130143495769065/AnsiballZ_systemd.py'
Feb 19 19:08:29 compute-0 sudo[105794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:29 compute-0 python3.9[105797]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 19 19:08:29 compute-0 systemd[1]: Reloading.
Feb 19 19:08:29 compute-0 systemd-sysv-generator[105825]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:08:29 compute-0 systemd-rc-local-generator[105818]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:08:29 compute-0 sudo[105794]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:29 compute-0 sudo[105913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lesumpqamkwxorpdhwhsxpfbryxszccq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528108.1986008-863-130143495769065/AnsiballZ_systemd.py'
Feb 19 19:08:29 compute-0 sudo[105913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:29 compute-0 python3.9[105916]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:08:29 compute-0 systemd[1]: Reloading.
Feb 19 19:08:30 compute-0 systemd-rc-local-generator[105944]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:08:30 compute-0 systemd-sysv-generator[105950]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:08:30 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Feb 19 19:08:30 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:08:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6674c277ee82c23057341185bcd18666a85d662022e36167f280d205279813f/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 19 19:08:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6674c277ee82c23057341185bcd18666a85d662022e36167f280d205279813f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 19 19:08:30 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1.
Feb 19 19:08:30 compute-0 podman[105965]: 2026-02-19 19:08:30.316702359 +0000 UTC m=+0.127244692 container init 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.43.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: + sudo -E kolla_set_configs
Feb 19 19:08:30 compute-0 podman[105965]: 2026-02-19 19:08:30.344196762 +0000 UTC m=+0.154739085 container start 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 19 19:08:30 compute-0 edpm-start-podman-container[105965]: ovn_metadata_agent
Feb 19 19:08:30 compute-0 edpm-start-podman-container[105964]: Creating additional drop-in dependency for "ovn_metadata_agent" (1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1)
Feb 19 19:08:30 compute-0 podman[105988]: 2026-02-19 19:08:30.399381903 +0000 UTC m=+0.048671272 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: INFO:__main__:Validating config file
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: INFO:__main__:Copying service configuration files
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 19 19:08:30 compute-0 systemd[1]: Reloading.
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: INFO:__main__:Writing out command to execute
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: ++ cat /run_command
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: + CMD=neutron-ovn-metadata-agent
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: + ARGS=
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: + sudo kolla_copy_cacerts
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: + [[ ! -n '' ]]
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: + . kolla_extend_start
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: Running command: 'neutron-ovn-metadata-agent'
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: + umask 0022
Feb 19 19:08:30 compute-0 ovn_metadata_agent[105981]: + exec neutron-ovn-metadata-agent
Feb 19 19:08:30 compute-0 systemd-rc-local-generator[106049]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:08:30 compute-0 systemd-sysv-generator[106053]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:08:30 compute-0 systemd[1]: Started ovn_metadata_agent container.
Feb 19 19:08:30 compute-0 sudo[105913]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:31 compute-0 python3.9[106224]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.064 105986 INFO neutron.common.config [-] Logging enabled!
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.064 105986 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 26.1.0.dev268
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.064 105986 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.12/site-packages/neutron/common/config.py:124
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.064 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.065 105986 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.065 105986 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.065 105986 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.065 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.065 105986 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.065 105986 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.065 105986 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.065 105986 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.065 105986 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.065 105986 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.065 105986 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.065 105986 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.065 105986 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.066 105986 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.066 105986 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.066 105986 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.066 105986 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.066 105986 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.066 105986 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.066 105986 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.066 105986 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.066 105986 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.066 105986 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.066 105986 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.066 105986 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.066 105986 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.066 105986 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.067 105986 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.067 105986 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.067 105986 DEBUG neutron.agent.ovn.metadata_agent [-] enable_signals                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.067 105986 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.067 105986 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.067 105986 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.067 105986 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.067 105986 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.067 105986 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.067 105986 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.067 105986 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.067 105986 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.067 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.067 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.068 105986 DEBUG neutron.agent.ovn.metadata_agent [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.068 105986 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.068 105986 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.068 105986 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.068 105986 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.068 105986 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.068 105986 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.068 105986 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.068 105986 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.068 105986 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.068 105986 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.068 105986 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.068 105986 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.068 105986 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.068 105986 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.069 105986 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.069 105986 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.069 105986 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.069 105986 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.069 105986 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.069 105986 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.069 105986 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.069 105986 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.069 105986 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.069 105986 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.069 105986 DEBUG neutron.agent.ovn.metadata_agent [-] my_ip                          = 38.102.83.219 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.069 105986 DEBUG neutron.agent.ovn.metadata_agent [-] my_ipv6                        = ::1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.069 105986 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.069 105986 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.070 105986 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.070 105986 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.070 105986 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.070 105986 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.070 105986 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.070 105986 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.070 105986 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.070 105986 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.070 105986 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.070 105986 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.070 105986 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.070 105986 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.070 105986 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.071 105986 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.071 105986 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.071 105986 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.071 105986 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.071 105986 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.071 105986 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.071 105986 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.071 105986 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.071 105986 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.071 105986 DEBUG neutron.agent.ovn.metadata_agent [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.071 105986 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.071 105986 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.071 105986 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.071 105986 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.071 105986 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.071 105986 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.072 105986 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.072 105986 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.072 105986 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.072 105986 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_qinq                      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.072 105986 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.072 105986 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.072 105986 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.072 105986 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.072 105986 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.072 105986 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.072 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.072 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.072 105986 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.072 105986 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.073 105986 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.073 105986 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.073 105986 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.073 105986 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.073 105986 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.073 105986 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.073 105986 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.073 105986 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_requests        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.073 105986 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.073 105986 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.process_tags   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.073 105986 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.073 105986 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_otlp.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.073 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.073 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.074 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.074 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.074 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.074 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.074 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.074 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.074 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.074 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.074 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.074 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.074 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.074 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.074 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.074 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_timeout     = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.074 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.075 105986 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.075 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.075 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.075 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.075 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.log_daemon_traceback   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.075 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.075 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.075 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.075 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.075 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.075 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.075 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.075 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.075 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.075 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.076 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.076 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.076 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.076 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.076 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.076 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.076 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.076 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.076 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.076 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.076 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.076 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.076 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.076 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.076 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.077 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.077 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.077 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.077 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.077 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.077 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.077 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.077 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.077 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.077 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.077 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.077 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.077 105986 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.077 105986 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.077 105986 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.078 105986 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.078 105986 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.078 105986 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.078 105986 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.078 105986 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.078 105986 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.078 105986 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.078 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.078 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mappings            = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.078 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.datapath_type              = system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.078 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.078 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_reports         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.079 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_unregistered    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.079 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.079 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.int_peer_patch_port        = patch-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.079 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.integration_bridge         = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.079 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.local_ip                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.079 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_connect_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.079 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_inactivity_probe        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.079 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_address          = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.079 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_port             = 6633 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.079 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_request_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.079 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.openflow_processed_per_port = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.079 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.080 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_debug                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.080 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.080 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.qos_meter_bandwidth        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.080 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_bandwidths = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.080 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_default_hypervisor = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.080 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_hypervisors = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.080 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.080 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.080 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_with_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.080 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_without_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.080 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_ca_cert_file           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.080 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_cert_file              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.080 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_key_file               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.080 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tun_peer_patch_port        = patch-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.080 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tunnel_bridge              = br-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.081 105986 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.vhostuser_socket_dir       = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.081 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.081 105986 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.081 105986 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.081 105986 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.081 105986 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.081 105986 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.081 105986 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.081 105986 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.081 105986 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.081 105986 DEBUG neutron.agent.ovn.metadata_agent [-] agent.extensions               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.081 105986 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.081 105986 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.081 105986 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.081 105986 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.082 105986 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.082 105986 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.082 105986 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.082 105986 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.082 105986 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.082 105986 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.082 105986 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.082 105986 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.082 105986 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.082 105986 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.082 105986 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.082 105986 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.082 105986 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.082 105986 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.082 105986 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.083 105986 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.083 105986 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.083 105986 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.083 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.083 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.083 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.083 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.083 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.083 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.083 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.083 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.083 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.083 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.084 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.084 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.084 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.084 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.084 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.084 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.084 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.084 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.084 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.084 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.084 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.084 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.084 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.084 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.084 105986 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.085 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.broadcast_arps_to_all_routers = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.085 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.085 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.085 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_records_ovn_owned      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.085 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.085 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.085 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.fdb_age_threshold          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.085 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.live_migration_activation_strategy = rarp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.085 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.localnet_learn_fdb         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.085 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.mac_binding_age_threshold  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.085 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.085 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.085 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.085 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.085 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.086 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.086 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.086 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.086 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = ['tcp:127.0.0.1:6641'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.086 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.086 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_router_indirect_snat   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.086 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.086 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.086 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ['ssl:ovsdbserver-sb.openstack.svc:6642'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.086 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.086 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.086 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.086 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.086 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.086 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.087 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.fdb_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.087 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.ignore_lsp_down  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.087 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.mac_binding_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.087 105986 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.087 105986 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.087 105986 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.087 105986 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.087 105986 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.ip_versions = [4] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.087 105986 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.rate_limit_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.087 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.087 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.087 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.087 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.087 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.087 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.088 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.088 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.088 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.088 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.088 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.088 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.088 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.088 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.088 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.088 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.088 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.088 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.processname = neutron-ovn-metadata-agent log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.088 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.088 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.089 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.089 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.089 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.089 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.089 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.089 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.089 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.089 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.089 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.089 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.089 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.089 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.089 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.089 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.089 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.090 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.090 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.090 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.090 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.090 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.090 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.090 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.090 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.090 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.090 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.090 105986 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.090 105986 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.133 105986 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.134 105986 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.134 105986 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.134 105986 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.134 105986 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.147 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name e8e72127-2f6b-43eb-b51a-e32006a33d3c (UUID: e8e72127-2f6b-43eb-b51a-e32006a33d3c) and ovn bridge br-int. _load_config /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:419
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.180 105986 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.180 105986 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.180 105986 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Port_Binding.logical_port autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.180 105986 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.180 105986 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.183 105986 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.191 105986 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.199 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'e8e72127-2f6b-43eb-b51a-e32006a33d3c'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], external_ids={}, name=e8e72127-2f6b-43eb-b51a-e32006a33d3c, nb_cfg_timestamp=1771528063859, nb_cfg=1) old= matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.201 105986 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp_65e_fei/privsep.sock']
Feb 19 19:08:32 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Feb 19 19:08:32 compute-0 sudo[106385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zchytdkrzkwoaffghfagqsjjxxdgekjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528112.6110728-953-257887709894592/AnsiballZ_stat.py'
Feb 19 19:08:32 compute-0 sudo[106385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.875 105986 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.875 105986 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp_65e_fei/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.767 106358 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.769 106358 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.771 106358 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.771 106358 INFO oslo.privsep.daemon [-] privsep daemon running as pid 106358
Feb 19 19:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:32.877 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[21c74902-b58b-46be-8605-61ef0230a81c]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:08:33 compute-0 python3.9[106388]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:08:33 compute-0 sudo[106385]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:33.262 106358 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:08:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:33.262 106358 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:08:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:33.262 106358 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:08:33 compute-0 sudo[106515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bheiwxiwlbinoaatfjgycsmgtlhpdppn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528112.6110728-953-257887709894592/AnsiballZ_copy.py'
Feb 19 19:08:33 compute-0 sudo[106515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:33 compute-0 python3.9[106518]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528112.6110728-953-257887709894592/.source.yaml _original_basename=.qthm680p follow=False checksum=741a9dcd633f184af87cab9bfeeb8a3584450a96 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:33 compute-0 sudo[106515]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:33.669 106358 INFO oslo_service.backend [-] Loading backend: eventlet
Feb 19 19:08:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:33.674 106358 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Feb 19 19:08:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:33.709 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[f77eda0e-6740-4877-8e62-a116124b5332]: (4, []) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:08:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:33.710 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, column=external_ids, values=({'neutron:ovn-metadata-id': '21dd0f3f-df84-50ba-a1bb-a46b2a6bfb4e'},)) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:08:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:33.719 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:08:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:08:33.724 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:08:34 compute-0 sshd-session[97698]: Connection closed by 192.168.122.30 port 50716
Feb 19 19:08:34 compute-0 sshd-session[97695]: pam_unix(sshd:session): session closed for user zuul
Feb 19 19:08:34 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Feb 19 19:08:34 compute-0 systemd[1]: session-22.scope: Consumed 28.321s CPU time.
Feb 19 19:08:34 compute-0 systemd-logind[822]: Session 22 logged out. Waiting for processes to exit.
Feb 19 19:08:34 compute-0 systemd-logind[822]: Removed session 22.
Feb 19 19:08:35 compute-0 podman[106543]: 2026-02-19 19:08:35.299434628 +0000 UTC m=+0.075216902 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, container_name=ovn_controller)
Feb 19 19:08:39 compute-0 sshd-session[106569]: Accepted publickey for zuul from 192.168.122.30 port 55414 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 19:08:39 compute-0 systemd-logind[822]: New session 23 of user zuul.
Feb 19 19:08:39 compute-0 systemd[1]: Started Session 23 of User zuul.
Feb 19 19:08:39 compute-0 sshd-session[106569]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 19:08:40 compute-0 python3.9[106722]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:08:41 compute-0 sudo[106876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgrzjctbidxjfszpzsulpjpzctrnsasm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528120.678387-43-20286395471094/AnsiballZ_command.py'
Feb 19 19:08:41 compute-0 sudo[106876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:41 compute-0 python3.9[106879]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:08:41 compute-0 sudo[106876]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:42 compute-0 sudo[107042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mghajjuppcyvwnzjtjqdtlaintptnqxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528121.7065618-65-242973702041829/AnsiballZ_systemd_service.py'
Feb 19 19:08:42 compute-0 sudo[107042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:42 compute-0 python3.9[107045]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 19 19:08:42 compute-0 systemd[1]: Reloading.
Feb 19 19:08:42 compute-0 systemd-rc-local-generator[107067]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:08:42 compute-0 systemd-sysv-generator[107071]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:08:42 compute-0 sudo[107042]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:43 compute-0 python3.9[107236]: ansible-ansible.builtin.service_facts Invoked
Feb 19 19:08:43 compute-0 network[107253]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 19 19:08:43 compute-0 network[107254]: 'network-scripts' will be removed from distribution in near future.
Feb 19 19:08:43 compute-0 network[107255]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 19 19:08:46 compute-0 sudo[107515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqgtpvyjhasrdsdoywrefossevgvesmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528126.5618775-103-45456961410490/AnsiballZ_systemd_service.py'
Feb 19 19:08:46 compute-0 sudo[107515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:47 compute-0 python3.9[107518]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:08:47 compute-0 sudo[107515]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:47 compute-0 sudo[107669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zikxmtqsvukgiucsihdlieqqyzfntzjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528127.1636002-103-201043657684288/AnsiballZ_systemd_service.py'
Feb 19 19:08:47 compute-0 sudo[107669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:47 compute-0 python3.9[107672]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:08:47 compute-0 sudo[107669]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:48 compute-0 sudo[107823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlebsgtqbehjnhhpqkoaalnbvpcnhbeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528127.8688443-103-12620061084255/AnsiballZ_systemd_service.py'
Feb 19 19:08:48 compute-0 sudo[107823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:48 compute-0 python3.9[107826]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:08:48 compute-0 sudo[107823]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:48 compute-0 sudo[107977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upvyxieksthtipblbzrvvhzmhwscgxpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528128.587665-103-10211636054864/AnsiballZ_systemd_service.py'
Feb 19 19:08:48 compute-0 sudo[107977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:49 compute-0 python3.9[107980]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:08:49 compute-0 sudo[107977]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:49 compute-0 sudo[108131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dowmdpucdgvzvttphhqpoklxmqlmcevw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528129.3101168-103-73074579782620/AnsiballZ_systemd_service.py'
Feb 19 19:08:49 compute-0 sudo[108131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:49 compute-0 python3.9[108134]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:08:49 compute-0 sudo[108131]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:50 compute-0 sudo[108285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fopyfkcadinmpgtgvhnzqqrsucpimsdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528130.0163445-103-36963782261937/AnsiballZ_systemd_service.py'
Feb 19 19:08:50 compute-0 sudo[108285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:50 compute-0 python3.9[108288]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:08:50 compute-0 sudo[108285]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:50 compute-0 sudo[108439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upcoucgcrtfxiajamobjmgztwswkxjjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528130.7302923-103-268998671915583/AnsiballZ_systemd_service.py'
Feb 19 19:08:50 compute-0 sudo[108439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:51 compute-0 python3.9[108442]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:08:51 compute-0 sudo[108439]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:52 compute-0 sudo[108593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugcaaconzhezkongfriqgghfbohgxfum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528131.957278-207-233023471762633/AnsiballZ_file.py'
Feb 19 19:08:52 compute-0 sudo[108593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:52 compute-0 python3.9[108596]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:52 compute-0 sudo[108593]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:52 compute-0 sudo[108746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrlilxtqkhwipugabrbfbjsaqyzhervc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528132.6281657-207-272056436779517/AnsiballZ_file.py'
Feb 19 19:08:52 compute-0 sudo[108746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:53 compute-0 python3.9[108749]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:53 compute-0 sudo[108746]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:53 compute-0 sudo[108899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhhtsflsjqraxskelyqlzzywkqmeveom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528133.1089513-207-10403248355552/AnsiballZ_file.py'
Feb 19 19:08:53 compute-0 sudo[108899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:53 compute-0 python3.9[108902]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:53 compute-0 sudo[108899]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:53 compute-0 sudo[109052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvibjvjjmhwwyxjqnwyjzsrlurtntkds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528133.5791898-207-57569382247827/AnsiballZ_file.py'
Feb 19 19:08:53 compute-0 sudo[109052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:54 compute-0 python3.9[109055]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:54 compute-0 sudo[109052]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:54 compute-0 sudo[109205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxzmffacwjkbqelsxejjpyrmilugbndx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528134.138588-207-159363228746605/AnsiballZ_file.py'
Feb 19 19:08:54 compute-0 sudo[109205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:54 compute-0 python3.9[109208]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:54 compute-0 sudo[109205]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:54 compute-0 sudo[109358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjlcmtkdniulvoyqelbnqihhnicmmwvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528134.6680732-207-227232440078983/AnsiballZ_file.py'
Feb 19 19:08:54 compute-0 sudo[109358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:55 compute-0 python3.9[109361]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:55 compute-0 sudo[109358]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:55 compute-0 sudo[109511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gremucykyfrhjevttxwefkhkjzhtnzfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528135.1643255-207-87465544977504/AnsiballZ_file.py'
Feb 19 19:08:55 compute-0 sudo[109511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:55 compute-0 python3.9[109514]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:55 compute-0 sudo[109511]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:56 compute-0 sudo[109664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdcgvekwlvahrdkjvnmxpcmkkanwevyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528136.4093845-307-28247682435528/AnsiballZ_file.py'
Feb 19 19:08:56 compute-0 sudo[109664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:56 compute-0 python3.9[109667]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:56 compute-0 sudo[109664]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:57 compute-0 sudo[109817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqutycnbfqyllagcuhehzujhtkupzcpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528136.9305449-307-60994942069453/AnsiballZ_file.py'
Feb 19 19:08:57 compute-0 sudo[109817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:57 compute-0 python3.9[109820]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:57 compute-0 sudo[109817]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:57 compute-0 sudo[109970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwgwiqvpideilqukjymmmcfoichrvcdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528137.425479-307-212865207045969/AnsiballZ_file.py'
Feb 19 19:08:57 compute-0 sudo[109970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:57 compute-0 python3.9[109973]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:57 compute-0 sudo[109970]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:58 compute-0 sudo[110123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmjkfpsvajdahcdjrblhijpwwwlxwyvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528137.9434028-307-253667107212182/AnsiballZ_file.py'
Feb 19 19:08:58 compute-0 sudo[110123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:58 compute-0 python3.9[110126]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:58 compute-0 sudo[110123]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:58 compute-0 sudo[110276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxrubpxjgrmkfibzslprlvvrjnillwgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528138.502495-307-24880593297195/AnsiballZ_file.py'
Feb 19 19:08:58 compute-0 sudo[110276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:58 compute-0 python3.9[110279]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:58 compute-0 sudo[110276]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:59 compute-0 sudo[110429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlbruzrhcuhigywyloqhqjmsytgaclrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528138.9591162-307-12258593101905/AnsiballZ_file.py'
Feb 19 19:08:59 compute-0 sudo[110429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:59 compute-0 python3.9[110432]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:59 compute-0 sudo[110429]: pam_unix(sudo:session): session closed for user root
Feb 19 19:08:59 compute-0 sudo[110582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcrbqdmgmrerosvtgxcgxffblinlftor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528139.6036785-307-82996618190752/AnsiballZ_file.py'
Feb 19 19:08:59 compute-0 sudo[110582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:08:59 compute-0 python3.9[110585]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:08:59 compute-0 sudo[110582]: pam_unix(sudo:session): session closed for user root
Feb 19 19:09:01 compute-0 sudo[110746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rngmblcsmvdgkwumoyrspfmlhipxdwzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528140.9035156-409-104030368737017/AnsiballZ_command.py'
Feb 19 19:09:01 compute-0 sudo[110746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:09:01 compute-0 podman[110709]: 2026-02-19 19:09:01.134711522 +0000 UTC m=+0.047500618 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Feb 19 19:09:01 compute-0 python3.9[110755]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:09:01 compute-0 sudo[110746]: pam_unix(sudo:session): session closed for user root
Feb 19 19:09:02 compute-0 python3.9[110909]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 19 19:09:02 compute-0 sudo[111059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpwdjmyohhdkoqvmqplllckhaoafruje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528142.4486513-445-265786792996428/AnsiballZ_systemd_service.py'
Feb 19 19:09:02 compute-0 sudo[111059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:09:02 compute-0 python3.9[111062]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 19 19:09:02 compute-0 systemd[1]: Reloading.
Feb 19 19:09:02 compute-0 systemd-rc-local-generator[111087]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:09:02 compute-0 systemd-sysv-generator[111090]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:09:03 compute-0 sudo[111059]: pam_unix(sudo:session): session closed for user root
Feb 19 19:09:03 compute-0 sudo[111255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbqeovehkuwsdlelyrizqffgopwqmemz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528143.3956356-461-140728722004796/AnsiballZ_command.py'
Feb 19 19:09:03 compute-0 sudo[111255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:09:03 compute-0 python3.9[111258]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:09:03 compute-0 sudo[111255]: pam_unix(sudo:session): session closed for user root
Feb 19 19:09:04 compute-0 sudo[111409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myctsktjjpfmjoqveymrpjmltnrgthuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528143.83153-461-153547440276513/AnsiballZ_command.py'
Feb 19 19:09:04 compute-0 sudo[111409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:09:04 compute-0 python3.9[111412]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:09:04 compute-0 sudo[111409]: pam_unix(sudo:session): session closed for user root
Feb 19 19:09:04 compute-0 sudo[111563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfyzyjrremcswlrjkelwjxrcuxcezxvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528144.3402503-461-252229279343687/AnsiballZ_command.py'
Feb 19 19:09:04 compute-0 sudo[111563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:09:04 compute-0 python3.9[111566]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:09:04 compute-0 sudo[111563]: pam_unix(sudo:session): session closed for user root
Feb 19 19:09:05 compute-0 sudo[111717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpcjhndxlqspylruxzaikspeiomqbkai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528144.888082-461-273932928894563/AnsiballZ_command.py'
Feb 19 19:09:05 compute-0 sudo[111717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:09:05 compute-0 python3.9[111720]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:09:05 compute-0 sudo[111717]: pam_unix(sudo:session): session closed for user root
Feb 19 19:09:05 compute-0 podman[111722]: 2026-02-19 19:09:05.454993114 +0000 UTC m=+0.095285663 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=watcher_latest)
Feb 19 19:09:05 compute-0 sudo[111898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulsxcrakbetmrqveseniajndpdyevfum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528145.4779568-461-169599254321823/AnsiballZ_command.py'
Feb 19 19:09:05 compute-0 sudo[111898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:09:05 compute-0 python3.9[111901]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:09:05 compute-0 sudo[111898]: pam_unix(sudo:session): session closed for user root
Feb 19 19:09:06 compute-0 sudo[112052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zksqlgwmqtywiibpsgtpvqqaatrrjugr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528146.0079052-461-92814773792113/AnsiballZ_command.py'
Feb 19 19:09:06 compute-0 sudo[112052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:09:06 compute-0 python3.9[112055]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:09:06 compute-0 sudo[112052]: pam_unix(sudo:session): session closed for user root
Feb 19 19:09:06 compute-0 sudo[112206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpxqoskyttwecmetbxhxbsppojphjzul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528146.5547998-461-132740736616491/AnsiballZ_command.py'
Feb 19 19:09:06 compute-0 sudo[112206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:09:06 compute-0 python3.9[112209]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:09:07 compute-0 sudo[112206]: pam_unix(sudo:session): session closed for user root
Feb 19 19:09:08 compute-0 sudo[112360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdekmbmtulasnscrlroehuhorrpgwvtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528148.0965903-569-13110275054110/AnsiballZ_getent.py'
Feb 19 19:09:08 compute-0 sudo[112360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:09:08 compute-0 python3.9[112363]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Feb 19 19:09:08 compute-0 sudo[112360]: pam_unix(sudo:session): session closed for user root
Feb 19 19:09:09 compute-0 sudo[112514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqjeleubgvfrpnvrugzmjqmommypzmde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528148.8562593-585-221541498603150/AnsiballZ_group.py'
Feb 19 19:09:09 compute-0 sudo[112514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:09:09 compute-0 python3.9[112517]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 19 19:09:09 compute-0 groupadd[112518]: group added to /etc/group: name=libvirt, GID=42473
Feb 19 19:09:09 compute-0 groupadd[112518]: group added to /etc/gshadow: name=libvirt
Feb 19 19:09:09 compute-0 groupadd[112518]: new group: name=libvirt, GID=42473
Feb 19 19:09:09 compute-0 sudo[112514]: pam_unix(sudo:session): session closed for user root
Feb 19 19:09:10 compute-0 sudo[112673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yujciqknegkpceywlxyeiwnmxgkrxklr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528149.8205836-601-230426830942349/AnsiballZ_user.py'
Feb 19 19:09:10 compute-0 sudo[112673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:09:10 compute-0 python3.9[112676]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 19 19:09:10 compute-0 useradd[112678]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/1
Feb 19 19:09:10 compute-0 sudo[112673]: pam_unix(sudo:session): session closed for user root
Feb 19 19:09:11 compute-0 sudo[112834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xabmzuzhuycjtrannhtuyczaredeyzav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528151.3307118-623-40048298585388/AnsiballZ_setup.py'
Feb 19 19:09:11 compute-0 sudo[112834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:09:11 compute-0 python3.9[112837]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 19 19:09:12 compute-0 sudo[112834]: pam_unix(sudo:session): session closed for user root
Feb 19 19:09:12 compute-0 sudo[112919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucasqcdvkdgkptvweuzytojvkdagwsuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528151.3307118-623-40048298585388/AnsiballZ_dnf.py'
Feb 19 19:09:12 compute-0 sudo[112919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:09:12 compute-0 python3.9[112922]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 19 19:09:31 compute-0 podman[113112]: 2026-02-19 19:09:31.268589203 +0000 UTC m=+0.046577286 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent)
Feb 19 19:09:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:09:32.092 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:09:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:09:32.092 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:09:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:09:32.092 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:09:36 compute-0 podman[113133]: 2026-02-19 19:09:36.35136133 +0000 UTC m=+0.129242252 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260216)
Feb 19 19:09:38 compute-0 kernel: SELinux:  Converting 2767 SID table entries...
Feb 19 19:09:38 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 19 19:09:38 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 19 19:09:38 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 19 19:09:38 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 19 19:09:38 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 19 19:09:38 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 19 19:09:38 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 19 19:09:47 compute-0 kernel: SELinux:  Converting 2767 SID table entries...
Feb 19 19:09:47 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 19 19:09:47 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 19 19:09:47 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 19 19:09:47 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 19 19:09:47 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 19 19:09:47 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 19 19:09:47 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 19 19:10:00 compute-0 sshd-session[115345]: Received disconnect from 43.166.137.151 port 36534:11: Bye Bye [preauth]
Feb 19 19:10:00 compute-0 sshd-session[115345]: Disconnected from authenticating user root 43.166.137.151 port 36534 [preauth]
Feb 19 19:10:02 compute-0 dbus-broker-launch[804]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Feb 19 19:10:02 compute-0 podman[117221]: 2026-02-19 19:10:02.2954066 +0000 UTC m=+0.060819767 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 19 19:10:07 compute-0 podman[122537]: 2026-02-19 19:10:07.280309386 +0000 UTC m=+0.058075937 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:10:08 compute-0 sshd-session[122941]: Invalid user admin from 197.211.55.20 port 41256
Feb 19 19:10:08 compute-0 sshd-session[122941]: Received disconnect from 197.211.55.20 port 41256:11: Bye Bye [preauth]
Feb 19 19:10:08 compute-0 sshd-session[122941]: Disconnected from invalid user admin 197.211.55.20 port 41256 [preauth]
Feb 19 19:10:11 compute-0 sshd-session[126535]: Invalid user mailuser from 138.255.157.62 port 26249
Feb 19 19:10:12 compute-0 sshd-session[126535]: Received disconnect from 138.255.157.62 port 26249:11: Bye Bye [preauth]
Feb 19 19:10:12 compute-0 sshd-session[126535]: Disconnected from invalid user mailuser 138.255.157.62 port 26249 [preauth]
Feb 19 19:10:26 compute-0 kernel: SELinux:  Converting 2768 SID table entries...
Feb 19 19:10:26 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 19 19:10:26 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 19 19:10:26 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 19 19:10:26 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 19 19:10:26 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 19 19:10:26 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 19 19:10:26 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 19 19:10:27 compute-0 groupadd[130133]: group added to /etc/group: name=dnsmasq, GID=993
Feb 19 19:10:27 compute-0 groupadd[130133]: group added to /etc/gshadow: name=dnsmasq
Feb 19 19:10:27 compute-0 groupadd[130133]: new group: name=dnsmasq, GID=993
Feb 19 19:10:27 compute-0 useradd[130140]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Feb 19 19:10:27 compute-0 dbus-broker-launch[794]: Noticed file-system modification, trigger reload.
Feb 19 19:10:27 compute-0 dbus-broker-launch[804]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Feb 19 19:10:27 compute-0 dbus-broker-launch[794]: Noticed file-system modification, trigger reload.
Feb 19 19:10:27 compute-0 groupadd[130153]: group added to /etc/group: name=clevis, GID=992
Feb 19 19:10:28 compute-0 groupadd[130153]: group added to /etc/gshadow: name=clevis
Feb 19 19:10:28 compute-0 groupadd[130153]: new group: name=clevis, GID=992
Feb 19 19:10:28 compute-0 useradd[130160]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Feb 19 19:10:28 compute-0 usermod[130170]: add 'clevis' to group 'tss'
Feb 19 19:10:28 compute-0 usermod[130170]: add 'clevis' to shadow group 'tss'
Feb 19 19:10:29 compute-0 polkitd[44509]: Reloading rules
Feb 19 19:10:29 compute-0 polkitd[44509]: Collecting garbage unconditionally...
Feb 19 19:10:29 compute-0 polkitd[44509]: Loading rules from directory /etc/polkit-1/rules.d
Feb 19 19:10:29 compute-0 polkitd[44509]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 19 19:10:29 compute-0 polkitd[44509]: Finished loading, compiling and executing 3 rules
Feb 19 19:10:29 compute-0 polkitd[44509]: Reloading rules
Feb 19 19:10:29 compute-0 polkitd[44509]: Collecting garbage unconditionally...
Feb 19 19:10:29 compute-0 polkitd[44509]: Loading rules from directory /etc/polkit-1/rules.d
Feb 19 19:10:29 compute-0 polkitd[44509]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 19 19:10:29 compute-0 polkitd[44509]: Finished loading, compiling and executing 3 rules
Feb 19 19:10:30 compute-0 groupadd[130360]: group added to /etc/group: name=ceph, GID=167
Feb 19 19:10:30 compute-0 groupadd[130360]: group added to /etc/gshadow: name=ceph
Feb 19 19:10:30 compute-0 groupadd[130360]: new group: name=ceph, GID=167
Feb 19 19:10:30 compute-0 useradd[130366]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Feb 19 19:10:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:10:32.093 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:10:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:10:32.094 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:10:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:10:32.094 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:10:32 compute-0 podman[130376]: 2026-02-19 19:10:32.636901474 +0000 UTC m=+0.057867212 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Feb 19 19:10:33 compute-0 sshd[1020]: Received signal 15; terminating.
Feb 19 19:10:33 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Feb 19 19:10:33 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Feb 19 19:10:33 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Feb 19 19:10:33 compute-0 systemd[1]: sshd.service: Consumed 2.892s CPU time, read 32.0K from disk, written 84.0K to disk.
Feb 19 19:10:33 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Feb 19 19:10:33 compute-0 systemd[1]: Stopping sshd-keygen.target...
Feb 19 19:10:33 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 19 19:10:33 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 19 19:10:33 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 19 19:10:33 compute-0 systemd[1]: Reached target sshd-keygen.target.
Feb 19 19:10:33 compute-0 systemd[1]: Starting OpenSSH server daemon...
Feb 19 19:10:33 compute-0 sshd[130905]: Server listening on 0.0.0.0 port 22.
Feb 19 19:10:33 compute-0 sshd[130905]: Server listening on :: port 22.
Feb 19 19:10:33 compute-0 systemd[1]: Started OpenSSH server daemon.
Feb 19 19:10:34 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 19 19:10:34 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 19 19:10:34 compute-0 systemd[1]: Reloading.
Feb 19 19:10:34 compute-0 systemd-rc-local-generator[131165]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:10:34 compute-0 systemd-sysv-generator[131168]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:10:34 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 19 19:10:36 compute-0 sudo[112919]: pam_unix(sudo:session): session closed for user root
Feb 19 19:10:38 compute-0 podman[136912]: 2026-02-19 19:10:38.342362529 +0000 UTC m=+0.120671289 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.43.0)
Feb 19 19:10:40 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 19 19:10:40 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 19 19:10:40 compute-0 systemd[1]: man-db-cache-update.service: Consumed 6.785s CPU time.
Feb 19 19:10:40 compute-0 systemd[1]: run-rbf45a44883ad46abb55ab9cfc50d72f6.service: Deactivated successfully.
Feb 19 19:10:46 compute-0 sudo[139755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nflwonukccjikgrkfdlntziffowesbxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528246.3083496-647-170792065653839/AnsiballZ_systemd.py'
Feb 19 19:10:46 compute-0 sudo[139755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:10:47 compute-0 python3.9[139758]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 19 19:10:47 compute-0 systemd[1]: Reloading.
Feb 19 19:10:47 compute-0 systemd-rc-local-generator[139783]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:10:47 compute-0 systemd-sysv-generator[139788]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:10:47 compute-0 sudo[139755]: pam_unix(sudo:session): session closed for user root
Feb 19 19:10:47 compute-0 sudo[139953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esmptkmvvrqgwirledojwouqfazcgfph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528247.545723-647-45478560383506/AnsiballZ_systemd.py'
Feb 19 19:10:47 compute-0 sudo[139953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:10:48 compute-0 python3.9[139956]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 19 19:10:48 compute-0 systemd[1]: Reloading.
Feb 19 19:10:48 compute-0 systemd-sysv-generator[139988]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:10:48 compute-0 systemd-rc-local-generator[139983]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:10:48 compute-0 sudo[139953]: pam_unix(sudo:session): session closed for user root
Feb 19 19:10:48 compute-0 sudo[140153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbgcyhucoxkwetgcmefwjxzptvyrmftp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528248.4485312-647-65506898377007/AnsiballZ_systemd.py'
Feb 19 19:10:48 compute-0 sudo[140153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:10:48 compute-0 python3.9[140156]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 19 19:10:48 compute-0 systemd[1]: Reloading.
Feb 19 19:10:49 compute-0 systemd-rc-local-generator[140177]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:10:49 compute-0 systemd-sysv-generator[140181]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:10:49 compute-0 sudo[140153]: pam_unix(sudo:session): session closed for user root
Feb 19 19:10:49 compute-0 sudo[140351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lblasnyinuskcbxdgilgpknxizqqcndg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528249.250481-647-240388799205097/AnsiballZ_systemd.py'
Feb 19 19:10:49 compute-0 sudo[140351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:10:49 compute-0 sshd-session[140001]: Invalid user shreyas from 27.50.25.190 port 34452
Feb 19 19:10:49 compute-0 python3.9[140354]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 19 19:10:49 compute-0 sshd-session[140001]: Received disconnect from 27.50.25.190 port 34452:11: Bye Bye [preauth]
Feb 19 19:10:49 compute-0 sshd-session[140001]: Disconnected from invalid user shreyas 27.50.25.190 port 34452 [preauth]
Feb 19 19:10:49 compute-0 systemd[1]: Reloading.
Feb 19 19:10:49 compute-0 systemd-rc-local-generator[140375]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:10:49 compute-0 systemd-sysv-generator[140385]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:10:50 compute-0 sudo[140351]: pam_unix(sudo:session): session closed for user root
Feb 19 19:10:51 compute-0 sudo[140549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rumittgykrzowryjaotdktrxljlybyay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528250.8706021-705-31849550499953/AnsiballZ_systemd.py'
Feb 19 19:10:51 compute-0 sudo[140549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:10:51 compute-0 python3.9[140552]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 19 19:10:51 compute-0 systemd[1]: Reloading.
Feb 19 19:10:51 compute-0 systemd-rc-local-generator[140587]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:10:51 compute-0 systemd-sysv-generator[140592]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:10:51 compute-0 sudo[140549]: pam_unix(sudo:session): session closed for user root
Feb 19 19:10:52 compute-0 sudo[140747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akdvdzefailbcsevfffkyezdoimpmpcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528251.7753317-705-122872491603943/AnsiballZ_systemd.py'
Feb 19 19:10:52 compute-0 sudo[140747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:10:52 compute-0 python3.9[140750]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 19 19:10:52 compute-0 systemd[1]: Reloading.
Feb 19 19:10:52 compute-0 systemd-sysv-generator[140788]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:10:52 compute-0 systemd-rc-local-generator[140784]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:10:52 compute-0 sudo[140747]: pam_unix(sudo:session): session closed for user root
Feb 19 19:10:52 compute-0 sudo[140945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htrvosamjjnpzujzvkgmnkbhjocyvbfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528252.6837096-705-160674853114227/AnsiballZ_systemd.py'
Feb 19 19:10:52 compute-0 sudo[140945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:10:53 compute-0 python3.9[140948]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 19 19:10:53 compute-0 systemd[1]: Reloading.
Feb 19 19:10:53 compute-0 systemd-rc-local-generator[140970]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:10:53 compute-0 systemd-sysv-generator[140974]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:10:53 compute-0 sudo[140945]: pam_unix(sudo:session): session closed for user root
Feb 19 19:10:53 compute-0 sudo[141142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgceuhgphpuznrdhihdtljolgmnfbmfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528253.5952988-705-31882174819498/AnsiballZ_systemd.py'
Feb 19 19:10:53 compute-0 sudo[141142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:10:54 compute-0 python3.9[141145]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 19 19:10:54 compute-0 sudo[141142]: pam_unix(sudo:session): session closed for user root
Feb 19 19:10:54 compute-0 sudo[141298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aumywfobfydgcidgzqremahkderdnycx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528254.312572-705-75956655001111/AnsiballZ_systemd.py'
Feb 19 19:10:54 compute-0 sudo[141298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:10:54 compute-0 python3.9[141301]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 19 19:10:54 compute-0 systemd[1]: Reloading.
Feb 19 19:10:55 compute-0 systemd-rc-local-generator[141334]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:10:55 compute-0 systemd-sysv-generator[141341]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:10:55 compute-0 sudo[141298]: pam_unix(sudo:session): session closed for user root
Feb 19 19:10:58 compute-0 sudo[141497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-othcclyscwxllsuxfrxfwxvxdmomersm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528257.7609289-777-132277980951777/AnsiballZ_systemd.py'
Feb 19 19:10:58 compute-0 sudo[141497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:10:58 compute-0 python3.9[141500]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 19 19:10:58 compute-0 systemd[1]: Reloading.
Feb 19 19:10:58 compute-0 systemd-rc-local-generator[141527]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:10:58 compute-0 systemd-sysv-generator[141532]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:10:58 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Feb 19 19:10:58 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Feb 19 19:10:58 compute-0 sudo[141497]: pam_unix(sudo:session): session closed for user root
Feb 19 19:10:59 compute-0 sudo[141699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nggpsxpitgitveqfyxkadjdjomaeaurx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528258.9205892-793-276194538011182/AnsiballZ_systemd.py'
Feb 19 19:10:59 compute-0 sudo[141699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:10:59 compute-0 python3.9[141702]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 19 19:10:59 compute-0 sudo[141699]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:00 compute-0 sudo[141855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzhvinpitpgzwstbouxrzjonchskgxyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528259.7327118-793-33009251967336/AnsiballZ_systemd.py'
Feb 19 19:11:00 compute-0 sudo[141855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:00 compute-0 python3.9[141858]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 19 19:11:00 compute-0 sudo[141855]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:00 compute-0 sudo[142011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycbcqgsjqecmfuqbwntlpfarehyolmzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528260.5388656-793-30500595938776/AnsiballZ_systemd.py'
Feb 19 19:11:00 compute-0 sudo[142011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:01 compute-0 python3.9[142014]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 19 19:11:01 compute-0 sudo[142011]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:01 compute-0 sudo[142167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdkcotxrfklgperzfubzqjayavvuvtjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528261.2639759-793-265763908043257/AnsiballZ_systemd.py'
Feb 19 19:11:01 compute-0 sudo[142167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:01 compute-0 python3.9[142170]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 19 19:11:01 compute-0 sudo[142167]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:02 compute-0 sudo[142323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqnbowzmtqykvbbptnvdznbtugarhkxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528261.923654-793-90815764757407/AnsiballZ_systemd.py'
Feb 19 19:11:02 compute-0 sudo[142323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:02 compute-0 python3.9[142326]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 19 19:11:02 compute-0 sudo[142323]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:02 compute-0 sudo[142490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjctmibvuitqseqzkmggpgjyfpgyrsya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528262.6034951-793-28003622748299/AnsiballZ_systemd.py'
Feb 19 19:11:02 compute-0 sudo[142490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:02 compute-0 podman[142453]: 2026-02-19 19:11:02.9372308 +0000 UTC m=+0.080578683 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Feb 19 19:11:03 compute-0 python3.9[142498]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 19 19:11:03 compute-0 sudo[142490]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:03 compute-0 sudo[142652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcrhkryybgyeschdtzltrgfqoifgayqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528263.4234772-793-17079333342286/AnsiballZ_systemd.py'
Feb 19 19:11:03 compute-0 sudo[142652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:03 compute-0 python3.9[142655]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 19 19:11:04 compute-0 sudo[142652]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:04 compute-0 sudo[142808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuajbhvzlahuznefowwngkzfkwbsfhsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528264.1232548-793-177977388876091/AnsiballZ_systemd.py'
Feb 19 19:11:04 compute-0 sudo[142808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:04 compute-0 python3.9[142811]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 19 19:11:04 compute-0 sudo[142808]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:05 compute-0 sudo[142964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pacjoyvcvvwasmfzqusvdszpwcjvtfzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528264.97081-793-209690344295066/AnsiballZ_systemd.py'
Feb 19 19:11:05 compute-0 sudo[142964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:05 compute-0 python3.9[142967]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 19 19:11:05 compute-0 sudo[142964]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:05 compute-0 sudo[143120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udypwfxglnsyyahpbqujrslizqoxlzmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528265.6967692-793-33482293416753/AnsiballZ_systemd.py'
Feb 19 19:11:05 compute-0 sudo[143120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:06 compute-0 python3.9[143123]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 19 19:11:06 compute-0 sudo[143120]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:06 compute-0 sudo[143276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpmknchblsyfbbyjlckcwdlcmboltrmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528266.4873855-793-192421652371631/AnsiballZ_systemd.py'
Feb 19 19:11:06 compute-0 sudo[143276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:06 compute-0 python3.9[143279]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 19 19:11:07 compute-0 sudo[143276]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:07 compute-0 sudo[143432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-parsqxbkevzztkmqnxyfevkclojudewl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528267.2166138-793-132281781918474/AnsiballZ_systemd.py'
Feb 19 19:11:07 compute-0 sudo[143432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:07 compute-0 python3.9[143435]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 19 19:11:07 compute-0 sudo[143432]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:08 compute-0 sudo[143588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwnyqlghhprgysjzucfecybytxlaijod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528268.035493-793-72164295504572/AnsiballZ_systemd.py'
Feb 19 19:11:08 compute-0 sudo[143588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:08 compute-0 python3.9[143591]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 19 19:11:08 compute-0 sudo[143588]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:08 compute-0 podman[143593]: 2026-02-19 19:11:08.650478643 +0000 UTC m=+0.083557989 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:11:08 compute-0 sudo[143771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aztffcrbvnfzymhnedpunyegmojgibmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528268.7387676-793-262719127680979/AnsiballZ_systemd.py'
Feb 19 19:11:08 compute-0 sudo[143771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:09 compute-0 python3.9[143774]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 19 19:11:09 compute-0 sudo[143771]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:10 compute-0 sudo[143927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqzrayspefgndduzyrfcfgpvenjpzyzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528270.3333168-997-171895766472047/AnsiballZ_file.py'
Feb 19 19:11:10 compute-0 sudo[143927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:10 compute-0 python3.9[143930]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:11:10 compute-0 sudo[143927]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:11 compute-0 sudo[144080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lswjrduywjsoggnfqciivhmefvtheoav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528270.8960228-997-215237361967410/AnsiballZ_file.py'
Feb 19 19:11:11 compute-0 sudo[144080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:11 compute-0 python3.9[144083]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:11:11 compute-0 sudo[144080]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:11 compute-0 sudo[144233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwrznxgzftsarcoyozwwfjatlwqlfabx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528271.4689834-997-232180339562851/AnsiballZ_file.py'
Feb 19 19:11:11 compute-0 sudo[144233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:11 compute-0 python3.9[144236]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:11:11 compute-0 sudo[144233]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:12 compute-0 sudo[144386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmydnmfhwatrvihfhdwvjaxbzshhtnrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528271.9751377-997-141968432661257/AnsiballZ_file.py'
Feb 19 19:11:12 compute-0 sudo[144386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:12 compute-0 python3.9[144389]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:11:12 compute-0 sudo[144386]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:12 compute-0 sudo[144539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-passbquwaknxzptrpbhjckbggbtytelb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528272.5551586-997-251245573261240/AnsiballZ_file.py'
Feb 19 19:11:12 compute-0 sudo[144539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:12 compute-0 python3.9[144542]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:11:13 compute-0 sudo[144539]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:13 compute-0 sudo[144692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvtdevljjovpbuwybsfowyauxznrzqjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528273.1469045-997-170070047570760/AnsiballZ_file.py'
Feb 19 19:11:13 compute-0 sudo[144692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:13 compute-0 python3.9[144695]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:11:13 compute-0 sudo[144692]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:14 compute-0 python3.9[144845]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:11:15 compute-0 sudo[144995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mskysrfkffrxfufvboymaqbvlnotbtep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528274.831168-1099-277899085312388/AnsiballZ_stat.py'
Feb 19 19:11:15 compute-0 sudo[144995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:15 compute-0 python3.9[144998]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:15 compute-0 sudo[144995]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:15 compute-0 sudo[145121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mygfuwyfjscnzerehkkxsbtzvwdkjtnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528274.831168-1099-277899085312388/AnsiballZ_copy.py'
Feb 19 19:11:15 compute-0 sudo[145121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:16 compute-0 python3.9[145124]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771528274.831168-1099-277899085312388/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:16 compute-0 sudo[145121]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:16 compute-0 sudo[145274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmhppafxodsxkpdrmlnolsbfqdpseesq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528276.209962-1099-280909822727972/AnsiballZ_stat.py'
Feb 19 19:11:16 compute-0 sudo[145274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:16 compute-0 python3.9[145277]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:16 compute-0 sudo[145274]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:17 compute-0 sudo[145400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjgecdpnaxrwilfleljvwiypyibgssym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528276.209962-1099-280909822727972/AnsiballZ_copy.py'
Feb 19 19:11:17 compute-0 sudo[145400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:17 compute-0 python3.9[145403]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771528276.209962-1099-280909822727972/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:17 compute-0 sudo[145400]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:17 compute-0 sudo[145553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvdrpswpzxsxlomiaqpfqzeasxhdksne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528277.401799-1099-276542002703717/AnsiballZ_stat.py'
Feb 19 19:11:17 compute-0 sudo[145553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:17 compute-0 python3.9[145556]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:17 compute-0 sudo[145553]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:18 compute-0 sudo[145679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncymsrxbfmvjlyaubafofmklcxsgdpay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528277.401799-1099-276542002703717/AnsiballZ_copy.py'
Feb 19 19:11:18 compute-0 sudo[145679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:18 compute-0 python3.9[145682]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771528277.401799-1099-276542002703717/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:18 compute-0 sudo[145679]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:18 compute-0 sudo[145832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbmdshximcinasaunmcondhzmotmtapg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528278.4802356-1099-186470267544035/AnsiballZ_stat.py'
Feb 19 19:11:18 compute-0 sudo[145832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:18 compute-0 python3.9[145835]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:18 compute-0 sudo[145832]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:19 compute-0 sudo[145958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkiiycjfwvdiofofmqbdjtewzlbacohx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528278.4802356-1099-186470267544035/AnsiballZ_copy.py'
Feb 19 19:11:19 compute-0 sudo[145958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:19 compute-0 python3.9[145961]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771528278.4802356-1099-186470267544035/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:19 compute-0 sudo[145958]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:19 compute-0 sudo[146111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpwemwrinjbzkosaznvujsackxqxosjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528279.6057105-1099-237651479206679/AnsiballZ_stat.py'
Feb 19 19:11:19 compute-0 sudo[146111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:20 compute-0 python3.9[146114]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:20 compute-0 sudo[146111]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:20 compute-0 sudo[146237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sigutdhwjtkwjgvlktwudttnqdujtqri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528279.6057105-1099-237651479206679/AnsiballZ_copy.py'
Feb 19 19:11:20 compute-0 sudo[146237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:20 compute-0 python3.9[146240]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771528279.6057105-1099-237651479206679/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:20 compute-0 sudo[146237]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:20 compute-0 sudo[146390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxmfmhrasyxwqiuvclqpmxxxrcuffhak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528280.5807188-1099-213599071160443/AnsiballZ_stat.py'
Feb 19 19:11:20 compute-0 sudo[146390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:20 compute-0 python3.9[146393]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:21 compute-0 sudo[146390]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:21 compute-0 sudo[146516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btszgjawecudtsvuavmckzkiidipohbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528280.5807188-1099-213599071160443/AnsiballZ_copy.py'
Feb 19 19:11:21 compute-0 sudo[146516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:21 compute-0 python3.9[146519]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771528280.5807188-1099-213599071160443/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:21 compute-0 sudo[146516]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:22 compute-0 sudo[146669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-palewcmelgokbggakoidmrowqscsphzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528281.7376068-1099-269377091506695/AnsiballZ_stat.py'
Feb 19 19:11:22 compute-0 sudo[146669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:22 compute-0 python3.9[146672]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:22 compute-0 sudo[146669]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:22 compute-0 sudo[146793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsietphjqomlqrlylpdedwwauaduenmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528281.7376068-1099-269377091506695/AnsiballZ_copy.py'
Feb 19 19:11:22 compute-0 sudo[146793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:22 compute-0 python3.9[146796]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771528281.7376068-1099-269377091506695/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:22 compute-0 sudo[146793]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:23 compute-0 sudo[146946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heorfqqfcqbksnvhhtusbrwauiseywns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528282.878699-1099-204184151795017/AnsiballZ_stat.py'
Feb 19 19:11:23 compute-0 sudo[146946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:23 compute-0 python3.9[146949]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:23 compute-0 sudo[146946]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:23 compute-0 sudo[147072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfwgxvzsvzvaknntqgwbbbxctlynmxvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528282.878699-1099-204184151795017/AnsiballZ_copy.py'
Feb 19 19:11:23 compute-0 sudo[147072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:23 compute-0 python3.9[147075]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771528282.878699-1099-204184151795017/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:23 compute-0 sudo[147072]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:24 compute-0 sudo[147225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzlkiblznbvnzuayzvddmnxshtmiirre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528284.501094-1325-67233315051660/AnsiballZ_command.py'
Feb 19 19:11:24 compute-0 sudo[147225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:24 compute-0 python3.9[147228]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Feb 19 19:11:25 compute-0 sudo[147225]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:25 compute-0 sudo[147379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbptkllqrrgsksmwvgkfonqvjwpdkcst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528285.2908745-1343-16114229119170/AnsiballZ_file.py'
Feb 19 19:11:25 compute-0 sudo[147379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:25 compute-0 python3.9[147382]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:25 compute-0 sudo[147379]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:26 compute-0 sudo[147532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znbwckmiprtbqzxclbvlqoaoptpijtzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528285.9104967-1343-171907962376038/AnsiballZ_file.py'
Feb 19 19:11:26 compute-0 sudo[147532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:26 compute-0 python3.9[147535]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:26 compute-0 sudo[147532]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:26 compute-0 sudo[147685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cejdxhzrxjzpcegmhmfanrgsntzdwrvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528286.4428315-1343-166139022534247/AnsiballZ_file.py'
Feb 19 19:11:26 compute-0 sudo[147685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:26 compute-0 python3.9[147688]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:26 compute-0 sudo[147685]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:27 compute-0 sudo[147838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bczxlbipddfjlajabqtbiayroofayzzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528286.880603-1343-44809974546889/AnsiballZ_file.py'
Feb 19 19:11:27 compute-0 sudo[147838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:27 compute-0 python3.9[147841]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:27 compute-0 sudo[147838]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:27 compute-0 sudo[147991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njhqumsgtuacazzizwiryuzlqzgnwgnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528287.4448333-1343-268473416384330/AnsiballZ_file.py'
Feb 19 19:11:27 compute-0 sudo[147991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:27 compute-0 python3.9[147994]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:27 compute-0 sudo[147991]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:28 compute-0 sudo[148144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryerzomtjxulukofrrselchavdlhqjwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528287.9981344-1343-140072869255546/AnsiballZ_file.py'
Feb 19 19:11:28 compute-0 sudo[148144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:28 compute-0 python3.9[148147]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:28 compute-0 sudo[148144]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:28 compute-0 sudo[148297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeyqndhqnyrseipikkanubalatydnvbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528288.5180755-1343-108981493772349/AnsiballZ_file.py'
Feb 19 19:11:28 compute-0 sudo[148297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:28 compute-0 python3.9[148300]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:28 compute-0 sudo[148297]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:29 compute-0 sudo[148450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqrpdjmilqzifndelqacdhfaeoqipjop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528289.0561175-1343-10968441721478/AnsiballZ_file.py'
Feb 19 19:11:29 compute-0 sudo[148450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:29 compute-0 python3.9[148453]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:29 compute-0 sudo[148450]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:29 compute-0 sudo[148603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dapcksbmhfrkxxgytwkyknojmqlgrwze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528289.5958195-1343-187840760983475/AnsiballZ_file.py'
Feb 19 19:11:29 compute-0 sudo[148603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:30 compute-0 python3.9[148606]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:30 compute-0 sudo[148603]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:30 compute-0 sudo[148756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxmpkvgoqsnncwvnlsjxawwcidsncezf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528290.151327-1343-80780866449703/AnsiballZ_file.py'
Feb 19 19:11:30 compute-0 sudo[148756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:30 compute-0 python3.9[148759]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:30 compute-0 sudo[148756]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:30 compute-0 sudo[148909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huopchuvaetrtblkyevtnewsujaawees ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528290.6565752-1343-127320536041274/AnsiballZ_file.py'
Feb 19 19:11:30 compute-0 sudo[148909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:31 compute-0 python3.9[148912]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:31 compute-0 sudo[148909]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:31 compute-0 sudo[149062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqbviyvdzikqcuejxfldecuteqqdhebs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528291.146604-1343-216054701983047/AnsiballZ_file.py'
Feb 19 19:11:31 compute-0 sudo[149062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:31 compute-0 python3.9[149065]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:31 compute-0 sudo[149062]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:31 compute-0 sudo[149215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fryjuzixopsnowtiqcloajrsikmgiowo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528291.6569343-1343-196715779543894/AnsiballZ_file.py'
Feb 19 19:11:31 compute-0 sudo[149215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:32 compute-0 python3.9[149218]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:32 compute-0 sudo[149215]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:11:32.095 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:11:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:11:32.095 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:11:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:11:32.096 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:11:32 compute-0 sudo[149369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enyjdmeqqxkxjtoqxfdeybozggtptqgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528292.1804194-1343-13780921461251/AnsiballZ_file.py'
Feb 19 19:11:32 compute-0 sudo[149369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:32 compute-0 python3.9[149372]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:32 compute-0 sudo[149369]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:33 compute-0 podman[149397]: 2026-02-19 19:11:33.258564747 +0000 UTC m=+0.040027585 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260216, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 19 19:11:34 compute-0 sudo[149542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ichrgesujddidiipzmdjtbfdkbterfei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528294.619338-1541-257832840817087/AnsiballZ_stat.py'
Feb 19 19:11:34 compute-0 sudo[149542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:35 compute-0 python3.9[149545]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:35 compute-0 sudo[149542]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:35 compute-0 sudo[149666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbmqxhswiutfilthqyfkopxzwdnbhhqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528294.619338-1541-257832840817087/AnsiballZ_copy.py'
Feb 19 19:11:35 compute-0 sudo[149666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:35 compute-0 python3.9[149669]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528294.619338-1541-257832840817087/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:35 compute-0 sudo[149666]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:35 compute-0 sudo[149819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuobowdbvsipjcqeoepzwqjvtpitcito ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528295.6802266-1541-108116334755606/AnsiballZ_stat.py'
Feb 19 19:11:35 compute-0 sudo[149819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:36 compute-0 python3.9[149822]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:36 compute-0 sudo[149819]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:36 compute-0 sudo[149943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utszmalwwhgryycvgyvahsufibyqajzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528295.6802266-1541-108116334755606/AnsiballZ_copy.py'
Feb 19 19:11:36 compute-0 sudo[149943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:36 compute-0 python3.9[149946]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528295.6802266-1541-108116334755606/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:36 compute-0 sudo[149943]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:37 compute-0 sudo[150096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msmbghrawsyqzjloteemhvbqqbgxewvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528296.7843-1541-269801667019023/AnsiballZ_stat.py'
Feb 19 19:11:37 compute-0 sudo[150096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:37 compute-0 python3.9[150099]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:37 compute-0 sudo[150096]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:37 compute-0 sudo[150220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aunlotcdnffnmaklntppihttchxhdwdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528296.7843-1541-269801667019023/AnsiballZ_copy.py'
Feb 19 19:11:37 compute-0 sudo[150220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:37 compute-0 python3.9[150223]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528296.7843-1541-269801667019023/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:37 compute-0 sudo[150220]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:38 compute-0 sudo[150373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovpbxzkgtltrdaozpqsglzcgbpxclisc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528297.792457-1541-248253975598577/AnsiballZ_stat.py'
Feb 19 19:11:38 compute-0 sudo[150373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:38 compute-0 python3.9[150376]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:38 compute-0 sudo[150373]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:38 compute-0 sudo[150497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkkefoqedunpzgzuindwhlxxugolyuov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528297.792457-1541-248253975598577/AnsiballZ_copy.py'
Feb 19 19:11:38 compute-0 sudo[150497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:38 compute-0 python3.9[150500]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528297.792457-1541-248253975598577/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:38 compute-0 sudo[150497]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:39 compute-0 sudo[150669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkzsucfwbmgnrdicdudzlmyolakrqnzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528298.7369578-1541-90698361580164/AnsiballZ_stat.py'
Feb 19 19:11:39 compute-0 sudo[150669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:39 compute-0 podman[150624]: 2026-02-19 19:11:39.041307232 +0000 UTC m=+0.079940087 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 19 19:11:39 compute-0 python3.9[150679]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:39 compute-0 sudo[150669]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:39 compute-0 sudo[150800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qykqdmhsxdyvniywleenyspygbpemvya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528298.7369578-1541-90698361580164/AnsiballZ_copy.py'
Feb 19 19:11:39 compute-0 sudo[150800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:39 compute-0 python3.9[150803]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528298.7369578-1541-90698361580164/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:39 compute-0 sudo[150800]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:40 compute-0 sudo[150953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnqavphehquzmluyggjinhjqhyzjpqxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528299.9013493-1541-22525012765373/AnsiballZ_stat.py'
Feb 19 19:11:40 compute-0 sudo[150953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:40 compute-0 python3.9[150956]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:40 compute-0 sudo[150953]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:40 compute-0 sudo[151077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wohkxwtmglfucpkuitkpgagecujzdglo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528299.9013493-1541-22525012765373/AnsiballZ_copy.py'
Feb 19 19:11:40 compute-0 sudo[151077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:40 compute-0 python3.9[151080]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528299.9013493-1541-22525012765373/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:40 compute-0 sudo[151077]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:41 compute-0 sudo[151230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpthjyrqahglmlqlwrztxmajdxaqrstn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528300.9191027-1541-84933025472375/AnsiballZ_stat.py'
Feb 19 19:11:41 compute-0 sudo[151230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:41 compute-0 python3.9[151233]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:41 compute-0 sudo[151230]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:41 compute-0 sudo[151354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utgljoodgrwydjovutbpxakmudximhlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528300.9191027-1541-84933025472375/AnsiballZ_copy.py'
Feb 19 19:11:41 compute-0 sudo[151354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:41 compute-0 python3.9[151357]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528300.9191027-1541-84933025472375/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:41 compute-0 sudo[151354]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:42 compute-0 sudo[151507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atyienscydobbelhiogptmsarkdvdpvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528301.9574325-1541-39212458964278/AnsiballZ_stat.py'
Feb 19 19:11:42 compute-0 sudo[151507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:42 compute-0 python3.9[151510]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:42 compute-0 sudo[151507]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:42 compute-0 sudo[151631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jadpyeapaqujsetqcquoepgpldxvdazn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528301.9574325-1541-39212458964278/AnsiballZ_copy.py'
Feb 19 19:11:42 compute-0 sudo[151631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:42 compute-0 python3.9[151634]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528301.9574325-1541-39212458964278/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:42 compute-0 sudo[151631]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:43 compute-0 sudo[151784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiuxnvtpjidttbmcrdktfhtzaovvvvbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528303.1417933-1541-106882938761067/AnsiballZ_stat.py'
Feb 19 19:11:43 compute-0 sudo[151784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:43 compute-0 python3.9[151787]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:43 compute-0 sudo[151784]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:43 compute-0 sudo[151908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yirlvwusnuyyfugxtkfjboxvytuerbon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528303.1417933-1541-106882938761067/AnsiballZ_copy.py'
Feb 19 19:11:43 compute-0 sudo[151908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:44 compute-0 python3.9[151911]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528303.1417933-1541-106882938761067/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:44 compute-0 sudo[151908]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:44 compute-0 sudo[152061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrbnfjxjpbnferisdgswjayjalnuvxly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528304.3643186-1541-239625744811903/AnsiballZ_stat.py'
Feb 19 19:11:44 compute-0 sudo[152061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:44 compute-0 python3.9[152064]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:44 compute-0 sudo[152061]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:45 compute-0 sudo[152187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqpzfuhjwvdquzejnmjqbmqwaqmmjbzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528304.3643186-1541-239625744811903/AnsiballZ_copy.py'
Feb 19 19:11:45 compute-0 sudo[152187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:45 compute-0 python3.9[152190]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528304.3643186-1541-239625744811903/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:45 compute-0 sudo[152187]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:45 compute-0 sudo[152340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tihfbkdopnikhkbfckqdlrhsssveaexn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528305.4526534-1541-67231680306451/AnsiballZ_stat.py'
Feb 19 19:11:45 compute-0 sudo[152340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:45 compute-0 python3.9[152343]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:45 compute-0 sudo[152340]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:46 compute-0 sudo[152464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nytftycpiaowmnipxhzawwdqbeuacnyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528305.4526534-1541-67231680306451/AnsiballZ_copy.py'
Feb 19 19:11:46 compute-0 sudo[152464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:46 compute-0 python3.9[152467]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528305.4526534-1541-67231680306451/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:46 compute-0 sudo[152464]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:46 compute-0 sshd-session[152083]: Invalid user nutanix from 182.75.216.74 port 17405
Feb 19 19:11:46 compute-0 sshd-session[152083]: Received disconnect from 182.75.216.74 port 17405:11: Bye Bye [preauth]
Feb 19 19:11:46 compute-0 sshd-session[152083]: Disconnected from invalid user nutanix 182.75.216.74 port 17405 [preauth]
Feb 19 19:11:46 compute-0 sudo[152617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcludufcxmkepqsafobviacliucskjsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528306.470283-1541-33654882217886/AnsiballZ_stat.py'
Feb 19 19:11:46 compute-0 sudo[152617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:46 compute-0 python3.9[152620]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:46 compute-0 sudo[152617]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:47 compute-0 sudo[152741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deyfuupqubxqpxznqleqsxvblisxmvzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528306.470283-1541-33654882217886/AnsiballZ_copy.py'
Feb 19 19:11:47 compute-0 sudo[152741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:47 compute-0 python3.9[152744]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528306.470283-1541-33654882217886/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:47 compute-0 sudo[152741]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:47 compute-0 sudo[152894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvbtdlnwdubraioohmvltfyonrvjhkro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528307.6909344-1541-73915608080507/AnsiballZ_stat.py'
Feb 19 19:11:47 compute-0 sudo[152894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:48 compute-0 python3.9[152897]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:48 compute-0 sudo[152894]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:48 compute-0 sudo[153018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccopgqezoffrnsebpykobftfzriioixn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528307.6909344-1541-73915608080507/AnsiballZ_copy.py'
Feb 19 19:11:48 compute-0 sudo[153018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:48 compute-0 python3.9[153021]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528307.6909344-1541-73915608080507/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:48 compute-0 sudo[153018]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:48 compute-0 sudo[153171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqcfbeoypablaerbvfxzuwobgvdzonin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528308.6269228-1541-64482238565064/AnsiballZ_stat.py'
Feb 19 19:11:48 compute-0 sudo[153171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:49 compute-0 python3.9[153174]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:11:49 compute-0 sudo[153171]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:49 compute-0 sudo[153297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgsbbwffaubjczjfwkbpdlogxmhrxoli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528308.6269228-1541-64482238565064/AnsiballZ_copy.py'
Feb 19 19:11:49 compute-0 sudo[153297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:49 compute-0 sshd-session[153175]: Invalid user ubuntu from 189.165.79.177 port 56998
Feb 19 19:11:49 compute-0 sshd-session[153175]: Received disconnect from 189.165.79.177 port 56998:11: Bye Bye [preauth]
Feb 19 19:11:49 compute-0 sshd-session[153175]: Disconnected from invalid user ubuntu 189.165.79.177 port 56998 [preauth]
Feb 19 19:11:49 compute-0 python3.9[153300]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528308.6269228-1541-64482238565064/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:49 compute-0 sudo[153297]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:51 compute-0 python3.9[153450]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:11:52 compute-0 sudo[153603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvbqetajidnibqyxwwigdnjpegjgcafz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528311.6885703-1953-170431310824374/AnsiballZ_seboolean.py'
Feb 19 19:11:52 compute-0 sudo[153603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:52 compute-0 python3.9[153606]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Feb 19 19:11:53 compute-0 sudo[153603]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:53 compute-0 sudo[153760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geqcntiuhymiaruvriihkiaajgwhesks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528313.4562898-1969-155022908851710/AnsiballZ_copy.py'
Feb 19 19:11:53 compute-0 dbus-broker-launch[804]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Feb 19 19:11:53 compute-0 sudo[153760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:53 compute-0 python3.9[153763]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:53 compute-0 sudo[153760]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:54 compute-0 sudo[153913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkvrtftnyhionttutofcxrqnntgurelt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528314.0022523-1969-110174080367650/AnsiballZ_copy.py'
Feb 19 19:11:54 compute-0 sudo[153913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:54 compute-0 python3.9[153916]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:54 compute-0 sudo[153913]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:54 compute-0 sudo[154066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdisskmhfamaxesesdmqkvgulcekntjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528314.6077044-1969-190981051571217/AnsiballZ_copy.py'
Feb 19 19:11:54 compute-0 sudo[154066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:55 compute-0 python3.9[154069]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:55 compute-0 sudo[154066]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:55 compute-0 sudo[154219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhvtpdrpcofxjadjrwxtcqgytddtnoyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528315.2264187-1969-93172502676002/AnsiballZ_copy.py'
Feb 19 19:11:55 compute-0 sudo[154219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:55 compute-0 python3.9[154222]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:55 compute-0 sudo[154219]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:56 compute-0 sudo[154372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqbsdoorgmpoavwdhgosasixuzgdknxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528315.8238926-1969-192948681269674/AnsiballZ_copy.py'
Feb 19 19:11:56 compute-0 sudo[154372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:56 compute-0 python3.9[154375]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:56 compute-0 sudo[154372]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:56 compute-0 sudo[154525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsjmouueiuihubzlcuhleifibgrccxpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528316.7481701-2041-235352793482044/AnsiballZ_copy.py'
Feb 19 19:11:56 compute-0 sudo[154525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:57 compute-0 python3.9[154528]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:57 compute-0 sudo[154525]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:57 compute-0 sudo[154678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqufjuxabsamvrtgpqaqcxflqaqewcnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528317.3675976-2041-87341147735782/AnsiballZ_copy.py'
Feb 19 19:11:57 compute-0 sudo[154678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:57 compute-0 python3.9[154681]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:57 compute-0 sudo[154678]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:58 compute-0 sudo[154831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmdukijmsyjasbnqdamvntcdmauaupyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528317.8315568-2041-227571151201660/AnsiballZ_copy.py'
Feb 19 19:11:58 compute-0 sudo[154831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:58 compute-0 python3.9[154834]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:58 compute-0 sudo[154831]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:58 compute-0 sudo[154984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-watmrgkxugthuwlyrqgypqtxhyjrxfzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528318.3436651-2041-7526672341081/AnsiballZ_copy.py'
Feb 19 19:11:58 compute-0 sudo[154984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:58 compute-0 python3.9[154987]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:58 compute-0 sudo[154984]: pam_unix(sudo:session): session closed for user root
Feb 19 19:11:59 compute-0 sudo[155137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-royjejvpnzxdifhqybxrwjalxksflwvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528318.8696826-2041-249694233540520/AnsiballZ_copy.py'
Feb 19 19:11:59 compute-0 sudo[155137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:11:59 compute-0 python3.9[155140]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:11:59 compute-0 sudo[155137]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:00 compute-0 sudo[155290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpwishrodmzhxtwrinqqtocohodrgddr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528320.1708634-2113-22926298310264/AnsiballZ_systemd.py'
Feb 19 19:12:00 compute-0 sudo[155290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:00 compute-0 python3.9[155293]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 19 19:12:00 compute-0 systemd[1]: Reloading.
Feb 19 19:12:00 compute-0 systemd-sysv-generator[155322]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:12:00 compute-0 systemd-rc-local-generator[155317]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:12:00 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Feb 19 19:12:00 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Feb 19 19:12:00 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Feb 19 19:12:00 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Feb 19 19:12:00 compute-0 systemd[1]: Starting libvirt logging daemon...
Feb 19 19:12:00 compute-0 systemd[1]: Started libvirt logging daemon.
Feb 19 19:12:01 compute-0 sudo[155290]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:01 compute-0 sudo[155492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftlkgptylveifdsdxtjkkajrrybnboaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528321.1431205-2113-31770645398134/AnsiballZ_systemd.py'
Feb 19 19:12:01 compute-0 sudo[155492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:01 compute-0 python3.9[155495]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 19 19:12:01 compute-0 systemd[1]: Reloading.
Feb 19 19:12:01 compute-0 systemd-rc-local-generator[155519]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:12:01 compute-0 systemd-sysv-generator[155525]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:12:01 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Feb 19 19:12:01 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Feb 19 19:12:01 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Feb 19 19:12:01 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Feb 19 19:12:01 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Feb 19 19:12:01 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Feb 19 19:12:01 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Feb 19 19:12:01 compute-0 systemd[1]: Started libvirt nodedev daemon.
Feb 19 19:12:01 compute-0 sudo[155492]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:02 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Feb 19 19:12:02 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Feb 19 19:12:02 compute-0 sudo[155717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofpwjyfuxbetuawmomeprxnjnfgupfes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528322.0891926-2113-126327562452833/AnsiballZ_systemd.py'
Feb 19 19:12:02 compute-0 sudo[155717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:02 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Feb 19 19:12:02 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Feb 19 19:12:02 compute-0 python3.9[155720]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 19 19:12:02 compute-0 systemd[1]: Reloading.
Feb 19 19:12:02 compute-0 systemd-sysv-generator[155754]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:12:02 compute-0 systemd-rc-local-generator[155750]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:12:02 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Feb 19 19:12:02 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Feb 19 19:12:02 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Feb 19 19:12:02 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Feb 19 19:12:02 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 19 19:12:02 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 19 19:12:02 compute-0 sudo[155717]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:03 compute-0 sudo[155944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izwchfgdposxvwvjmcsonuyfmuxhqzoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528323.095561-2113-172118608229273/AnsiballZ_systemd.py'
Feb 19 19:12:03 compute-0 sudo[155944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:03 compute-0 podman[155946]: 2026-02-19 19:12:03.337537031 +0000 UTC m=+0.044543891 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Feb 19 19:12:03 compute-0 setroubleshoot[155569]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 9809f410-5029-43b1-b3e6-e6bec0c42852
Feb 19 19:12:03 compute-0 setroubleshoot[155569]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Feb 19 19:12:03 compute-0 rsyslogd[1019]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 19 19:12:03 compute-0 python3.9[155948]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 19 19:12:03 compute-0 systemd[1]: Reloading.
Feb 19 19:12:03 compute-0 systemd-rc-local-generator[155990]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:12:03 compute-0 systemd-sysv-generator[155997]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:12:03 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Feb 19 19:12:03 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Feb 19 19:12:03 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Feb 19 19:12:03 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Feb 19 19:12:03 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Feb 19 19:12:03 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Feb 19 19:12:03 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Feb 19 19:12:03 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Feb 19 19:12:03 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Feb 19 19:12:03 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Feb 19 19:12:03 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Feb 19 19:12:03 compute-0 systemd[1]: Started libvirt QEMU daemon.
Feb 19 19:12:03 compute-0 sudo[155944]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:04 compute-0 sudo[156187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwlbtijcxjbkzoqvhycuqimruspcjywh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528324.030106-2113-64939642356418/AnsiballZ_systemd.py'
Feb 19 19:12:04 compute-0 sudo[156187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:04 compute-0 python3.9[156190]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 19 19:12:04 compute-0 systemd[1]: Reloading.
Feb 19 19:12:04 compute-0 systemd-rc-local-generator[156209]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:12:04 compute-0 systemd-sysv-generator[156215]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:12:04 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Feb 19 19:12:04 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Feb 19 19:12:04 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Feb 19 19:12:04 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Feb 19 19:12:04 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Feb 19 19:12:04 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Feb 19 19:12:04 compute-0 systemd[1]: Starting libvirt secret daemon...
Feb 19 19:12:04 compute-0 systemd[1]: Started libvirt secret daemon.
Feb 19 19:12:04 compute-0 sudo[156187]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:06 compute-0 sudo[156406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oshitpykuzllbdtaedijibkzvekwoltn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528325.79374-2187-3343250264195/AnsiballZ_file.py'
Feb 19 19:12:06 compute-0 sudo[156406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:06 compute-0 python3.9[156409]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:12:06 compute-0 sudo[156406]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:06 compute-0 sudo[156559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tojftyljccruuyzfttptxvemvnndyauy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528326.468438-2203-179056725180967/AnsiballZ_find.py'
Feb 19 19:12:06 compute-0 sudo[156559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:06 compute-0 python3.9[156562]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 19 19:12:06 compute-0 sudo[156559]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:07 compute-0 sshd-session[156563]: Received disconnect from 91.224.92.108 port 24026:11:  [preauth]
Feb 19 19:12:07 compute-0 sshd-session[156563]: Disconnected from authenticating user root 91.224.92.108 port 24026 [preauth]
Feb 19 19:12:07 compute-0 sudo[156714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywzhmsjnzwapqmgsrynxymiqfyotlmbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528327.4098608-2231-239456165089886/AnsiballZ_stat.py'
Feb 19 19:12:07 compute-0 sudo[156714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:07 compute-0 python3.9[156717]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:12:07 compute-0 sudo[156714]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:08 compute-0 sudo[156838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uakbhldzyckdkivrrnuzuenhbszmjnuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528327.4098608-2231-239456165089886/AnsiballZ_copy.py'
Feb 19 19:12:08 compute-0 sudo[156838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:08 compute-0 python3.9[156841]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528327.4098608-2231-239456165089886/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:12:08 compute-0 sudo[156838]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:08 compute-0 sudo[156991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfbehzsfwbgiapvxrirdhznporclmmgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528328.6732998-2263-24370173782861/AnsiballZ_file.py'
Feb 19 19:12:08 compute-0 sudo[156991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:09 compute-0 python3.9[156994]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:12:09 compute-0 sudo[156991]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:09 compute-0 podman[156995]: 2026-02-19 19:12:09.191700901 +0000 UTC m=+0.066616705 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20260216)
Feb 19 19:12:09 compute-0 sudo[157171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssfbxthrqbravbdnghneaytgklarjlru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528329.319678-2279-131916670442686/AnsiballZ_stat.py'
Feb 19 19:12:09 compute-0 sudo[157171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:09 compute-0 python3.9[157174]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:12:09 compute-0 sudo[157171]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:09 compute-0 sudo[157250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sydsjifnkbocmszvgbaqvdwpqxksmpgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528329.319678-2279-131916670442686/AnsiballZ_file.py'
Feb 19 19:12:09 compute-0 sudo[157250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:10 compute-0 python3.9[157253]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:12:10 compute-0 sudo[157250]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:10 compute-0 sudo[157403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvhvkboqfadbaisljwocoznojqdvhrkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528330.3956864-2303-254274185361433/AnsiballZ_stat.py'
Feb 19 19:12:10 compute-0 sudo[157403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:10 compute-0 python3.9[157406]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:12:10 compute-0 sudo[157403]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:11 compute-0 sudo[157482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvkqtaagrbrrvcviklcdmdzhuaqrqrll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528330.3956864-2303-254274185361433/AnsiballZ_file.py'
Feb 19 19:12:11 compute-0 sudo[157482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:11 compute-0 python3.9[157485]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.rpu4rjma recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:12:11 compute-0 sudo[157482]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:11 compute-0 sudo[157635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grsvvgilzctyrxoolcwoojtibnwwdawl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528331.4535306-2327-228989932194962/AnsiballZ_stat.py'
Feb 19 19:12:11 compute-0 sudo[157635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:11 compute-0 python3.9[157638]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:12:11 compute-0 sudo[157635]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:12 compute-0 sudo[157714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdlrjhftxxiinenmiqjzegxrklivbvwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528331.4535306-2327-228989932194962/AnsiballZ_file.py'
Feb 19 19:12:12 compute-0 sudo[157714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:12 compute-0 python3.9[157717]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:12:12 compute-0 sudo[157714]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:12 compute-0 sudo[157867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfjjevaoafokmpepivzwlbmiwmljuaux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528332.5888422-2353-139842331305003/AnsiballZ_command.py'
Feb 19 19:12:12 compute-0 sudo[157867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:12 compute-0 python3.9[157870]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:12:12 compute-0 sudo[157867]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:13 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Feb 19 19:12:13 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Feb 19 19:12:13 compute-0 sudo[158021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsliwnluepycgcqlzrruntzdfvsjvvav ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771528333.2860777-2369-161248347982502/AnsiballZ_edpm_nftables_from_files.py'
Feb 19 19:12:13 compute-0 sudo[158021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:13 compute-0 python3[158024]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 19 19:12:13 compute-0 sudo[158021]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:14 compute-0 sudo[158174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfqdprenmfhayfwjrphcyzsbtpgrjjmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528334.0735629-2385-193463829134130/AnsiballZ_stat.py'
Feb 19 19:12:14 compute-0 sudo[158174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:14 compute-0 python3.9[158177]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:12:14 compute-0 sudo[158174]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:14 compute-0 sudo[158253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sukvktdifvtucbqvgpxqruhyjweacbpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528334.0735629-2385-193463829134130/AnsiballZ_file.py'
Feb 19 19:12:14 compute-0 sudo[158253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:14 compute-0 python3.9[158256]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:12:15 compute-0 sudo[158253]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:15 compute-0 sudo[158406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjlbcpqengnnzirwgdzmuftrfjhgkvao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528335.2803357-2409-201441609531733/AnsiballZ_stat.py'
Feb 19 19:12:15 compute-0 sudo[158406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:15 compute-0 python3.9[158409]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:12:15 compute-0 sudo[158406]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:16 compute-0 sudo[158532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtliwhkhoykwlprwldwwfznbdqylckmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528335.2803357-2409-201441609531733/AnsiballZ_copy.py'
Feb 19 19:12:16 compute-0 sudo[158532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:16 compute-0 python3.9[158535]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528335.2803357-2409-201441609531733/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:12:16 compute-0 sudo[158532]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:16 compute-0 sudo[158685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msmefkczyawewxgdvknbxvagclzwjyjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528336.5003304-2439-101046790413082/AnsiballZ_stat.py'
Feb 19 19:12:16 compute-0 sudo[158685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:16 compute-0 python3.9[158688]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:12:16 compute-0 sudo[158685]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:17 compute-0 sudo[158764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfjbfsioufpruvfpdwgisxfcvlfwpmgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528336.5003304-2439-101046790413082/AnsiballZ_file.py'
Feb 19 19:12:17 compute-0 sudo[158764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:17 compute-0 python3.9[158767]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:12:17 compute-0 sudo[158764]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:17 compute-0 sudo[158917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kumyyqihztuivdjniizofrjxsaxdfozu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528337.470594-2463-134116830584417/AnsiballZ_stat.py'
Feb 19 19:12:17 compute-0 sudo[158917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:17 compute-0 python3.9[158920]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:12:17 compute-0 sudo[158917]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:18 compute-0 sudo[158996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifdfmzjhtlubfojskeqdipogvythpmmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528337.470594-2463-134116830584417/AnsiballZ_file.py'
Feb 19 19:12:18 compute-0 sudo[158996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:18 compute-0 python3.9[158999]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:12:18 compute-0 sudo[158996]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:18 compute-0 sudo[159149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfbllxbfuksxvbqxlaacnuoggnuhxadw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528338.5238726-2487-183638405717858/AnsiballZ_stat.py'
Feb 19 19:12:18 compute-0 sudo[159149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:19 compute-0 python3.9[159152]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:12:19 compute-0 sudo[159149]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:19 compute-0 sudo[159275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osuzzfmrebmmeuaegriqkjqxykvkofao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528338.5238726-2487-183638405717858/AnsiballZ_copy.py'
Feb 19 19:12:19 compute-0 sudo[159275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:19 compute-0 python3.9[159278]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528338.5238726-2487-183638405717858/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:12:19 compute-0 sudo[159275]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:20 compute-0 sudo[159428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfykbxzdmslpyqzfsolbvzlmsntnropu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528339.7554345-2517-100836058628778/AnsiballZ_file.py'
Feb 19 19:12:20 compute-0 sudo[159428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:20 compute-0 python3.9[159431]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:12:20 compute-0 sudo[159428]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:20 compute-0 sudo[159581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ceyzhgtpfgvewvojpwfxgcthfhmlritc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528340.4244351-2533-70848682413359/AnsiballZ_command.py'
Feb 19 19:12:20 compute-0 sudo[159581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:20 compute-0 python3.9[159584]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:12:20 compute-0 sudo[159581]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:21 compute-0 sudo[159737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaqbsmdnyxjrnnxukqdoodbibkveserc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528341.0815454-2549-5000482443596/AnsiballZ_blockinfile.py'
Feb 19 19:12:21 compute-0 sudo[159737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:21 compute-0 python3.9[159740]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:12:21 compute-0 sudo[159737]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:22 compute-0 sudo[159890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuqgqqjkbfshfqjvfoiuudeudmwmylah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528342.0091605-2567-228045500999133/AnsiballZ_command.py'
Feb 19 19:12:22 compute-0 sudo[159890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:22 compute-0 python3.9[159893]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:12:22 compute-0 sudo[159890]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:22 compute-0 sudo[160044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmgenjcmozeylpnzazpibuzbzmoxnsoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528342.7087505-2583-149846528793850/AnsiballZ_stat.py'
Feb 19 19:12:22 compute-0 sudo[160044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:23 compute-0 python3.9[160047]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:12:23 compute-0 sudo[160044]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:23 compute-0 sudo[160199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbcrehephphtyadqjxkwdcrbyfnlmpxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528343.2981749-2599-46731347468560/AnsiballZ_command.py'
Feb 19 19:12:23 compute-0 sudo[160199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:23 compute-0 python3.9[160202]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:12:23 compute-0 sudo[160199]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:24 compute-0 sudo[160355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jphugdkooddytytyoukqlojqrjaurqlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528344.0256355-2615-163417286901338/AnsiballZ_file.py'
Feb 19 19:12:24 compute-0 sudo[160355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:24 compute-0 python3.9[160358]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:12:24 compute-0 sudo[160355]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:25 compute-0 sudo[160508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oekxvkxngsuntacfojgkcuamorholibz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528344.852808-2631-259252313158783/AnsiballZ_stat.py'
Feb 19 19:12:25 compute-0 sudo[160508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:25 compute-0 python3.9[160511]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:12:25 compute-0 sudo[160508]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:25 compute-0 sudo[160632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijkopuhsfwagwmxsdgbmuilwgrvqgtem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528344.852808-2631-259252313158783/AnsiballZ_copy.py'
Feb 19 19:12:25 compute-0 sudo[160632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:25 compute-0 python3.9[160635]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528344.852808-2631-259252313158783/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:12:25 compute-0 sudo[160632]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:26 compute-0 sudo[160785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdhmlvfzdmizsclrcwzsqtxukqsvigst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528346.015592-2661-265773920564029/AnsiballZ_stat.py'
Feb 19 19:12:26 compute-0 sudo[160785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:26 compute-0 python3.9[160788]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:12:26 compute-0 sudo[160785]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:26 compute-0 sudo[160909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbdaebuctxhryvuthulomdcvphqoyjek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528346.015592-2661-265773920564029/AnsiballZ_copy.py'
Feb 19 19:12:26 compute-0 sudo[160909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:26 compute-0 python3.9[160912]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528346.015592-2661-265773920564029/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:12:26 compute-0 sudo[160909]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:27 compute-0 sudo[161062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inweqmbfbcgamghswwilegguhyidwefv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528347.204268-2691-265508554639051/AnsiballZ_stat.py'
Feb 19 19:12:27 compute-0 sudo[161062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:27 compute-0 python3.9[161065]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:12:27 compute-0 sudo[161062]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:27 compute-0 sudo[161186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdzunjgzavtklsoqhiaekapkcaodftew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528347.204268-2691-265508554639051/AnsiballZ_copy.py'
Feb 19 19:12:27 compute-0 sudo[161186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:28 compute-0 python3.9[161189]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528347.204268-2691-265508554639051/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:12:28 compute-0 sudo[161186]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:28 compute-0 sudo[161339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysvivcgfmgqegnwcelpdjrvcgaqxmsnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528348.446049-2721-155211144048219/AnsiballZ_systemd.py'
Feb 19 19:12:28 compute-0 sudo[161339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:28 compute-0 python3.9[161342]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:12:28 compute-0 systemd[1]: Reloading.
Feb 19 19:12:29 compute-0 systemd-rc-local-generator[161365]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:12:29 compute-0 systemd-sysv-generator[161370]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:12:29 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Feb 19 19:12:29 compute-0 sudo[161339]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:29 compute-0 sudo[161538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeqiqweuoxydqvnfraqbavqxobhmzahd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528349.509939-2737-22623973804954/AnsiballZ_systemd.py'
Feb 19 19:12:29 compute-0 sudo[161538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:29 compute-0 python3.9[161541]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 19 19:12:30 compute-0 systemd[1]: Reloading.
Feb 19 19:12:30 compute-0 systemd-rc-local-generator[161560]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:12:30 compute-0 systemd-sysv-generator[161567]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:12:30 compute-0 systemd[1]: Reloading.
Feb 19 19:12:30 compute-0 systemd-sysv-generator[161617]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:12:30 compute-0 systemd-rc-local-generator[161614]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:12:30 compute-0 sudo[161538]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:31 compute-0 sshd-session[106572]: Connection closed by 192.168.122.30 port 55414
Feb 19 19:12:31 compute-0 sshd-session[106569]: pam_unix(sshd:session): session closed for user zuul
Feb 19 19:12:31 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Feb 19 19:12:31 compute-0 systemd[1]: session-23.scope: Consumed 2min 44.060s CPU time.
Feb 19 19:12:31 compute-0 systemd-logind[822]: Session 23 logged out. Waiting for processes to exit.
Feb 19 19:12:31 compute-0 systemd-logind[822]: Removed session 23.
Feb 19 19:12:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:12:32.097 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:12:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:12:32.098 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:12:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:12:32.098 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:12:34 compute-0 podman[161653]: 2026-02-19 19:12:34.298379572 +0000 UTC m=+0.072159621 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Feb 19 19:12:36 compute-0 sshd-session[161672]: Accepted publickey for zuul from 192.168.122.30 port 56434 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 19:12:36 compute-0 systemd-logind[822]: New session 24 of user zuul.
Feb 19 19:12:36 compute-0 systemd[1]: Started Session 24 of User zuul.
Feb 19 19:12:36 compute-0 sshd-session[161672]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 19:12:37 compute-0 python3.9[161825]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:12:38 compute-0 python3.9[161979]: ansible-ansible.builtin.service_facts Invoked
Feb 19 19:12:38 compute-0 network[161996]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 19 19:12:38 compute-0 network[161997]: 'network-scripts' will be removed from distribution in near future.
Feb 19 19:12:38 compute-0 network[161998]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 19 19:12:39 compute-0 podman[162004]: 2026-02-19 19:12:39.663353578 +0000 UTC m=+0.124700929 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.43.0, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 19 19:12:43 compute-0 sudo[162295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdxyryccilxtylazqubwkrkdduhjhtum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528363.2793005-69-25469885780526/AnsiballZ_setup.py'
Feb 19 19:12:43 compute-0 sudo[162295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:43 compute-0 python3.9[162298]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 19 19:12:44 compute-0 sudo[162295]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:44 compute-0 sudo[162380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygbkxbbjfuftdqbbvuwufvrftsafbphe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528363.2793005-69-25469885780526/AnsiballZ_dnf.py'
Feb 19 19:12:44 compute-0 sudo[162380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:44 compute-0 python3.9[162383]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 19 19:12:49 compute-0 sudo[162380]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:50 compute-0 sudo[162534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frdtjodebmvirhcohwjlxkcwfdykttun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528369.9614146-93-90864311500406/AnsiballZ_stat.py'
Feb 19 19:12:50 compute-0 sudo[162534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:50 compute-0 python3.9[162537]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:12:50 compute-0 sudo[162534]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:51 compute-0 sudo[162687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvsydwhrsghkrzfnyzowmletflqmhgzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528370.8766046-113-210463292346090/AnsiballZ_command.py'
Feb 19 19:12:51 compute-0 sudo[162687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:51 compute-0 python3.9[162690]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:12:51 compute-0 sudo[162687]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:52 compute-0 sudo[162841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfzyguykuohnaxqbaoyagnvlapjpasao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528371.815388-133-105933229078529/AnsiballZ_stat.py'
Feb 19 19:12:52 compute-0 sudo[162841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:52 compute-0 python3.9[162844]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:12:52 compute-0 sudo[162841]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:52 compute-0 sudo[162994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psmtlcwsubzcxbljloxhxsulcsfckzmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528372.491951-149-119953398626619/AnsiballZ_command.py'
Feb 19 19:12:52 compute-0 sudo[162994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:52 compute-0 python3.9[162997]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:12:52 compute-0 sudo[162994]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:53 compute-0 sudo[163148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpmfgvfitmubrhjsxclcquhvonveheol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528373.1372414-165-218303693535427/AnsiballZ_stat.py'
Feb 19 19:12:53 compute-0 sudo[163148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:53 compute-0 python3.9[163151]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:12:53 compute-0 sudo[163148]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:54 compute-0 sudo[163272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leazdzyahgthsggclmwtoenqsfmhagvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528373.1372414-165-218303693535427/AnsiballZ_copy.py'
Feb 19 19:12:54 compute-0 sudo[163272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:54 compute-0 python3.9[163275]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528373.1372414-165-218303693535427/.source.iscsi _original_basename=.3cetxcsn follow=False checksum=a65f5ca852f3c0beec9db5f4b94e17585588ddff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:12:54 compute-0 sudo[163272]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:54 compute-0 sudo[163425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfqvknijekpvschuxxrifahrnqhlgoju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528374.5323238-195-161884706359559/AnsiballZ_file.py'
Feb 19 19:12:54 compute-0 sudo[163425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:55 compute-0 python3.9[163428]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:12:55 compute-0 sudo[163425]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:55 compute-0 sudo[163578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzywedlnnhnndmpazmmzfeurgpkikhqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528375.3777974-211-249251227608513/AnsiballZ_lineinfile.py'
Feb 19 19:12:55 compute-0 sudo[163578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:55 compute-0 python3.9[163581]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:12:55 compute-0 sudo[163578]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:56 compute-0 sudo[163731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owulmficdfencfjgelvavpowtttewjwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528376.2554317-229-121303799687331/AnsiballZ_systemd_service.py'
Feb 19 19:12:56 compute-0 sudo[163731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:57 compute-0 python3.9[163734]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:12:57 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Feb 19 19:12:57 compute-0 sudo[163731]: pam_unix(sudo:session): session closed for user root
Feb 19 19:12:57 compute-0 sudo[163888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttwxqqkeycsafkadhceegkynkivxsgkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528377.4247162-245-14926689429934/AnsiballZ_systemd_service.py'
Feb 19 19:12:57 compute-0 sudo[163888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:12:57 compute-0 python3.9[163891]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:12:58 compute-0 systemd[1]: Reloading.
Feb 19 19:12:58 compute-0 systemd-sysv-generator[163918]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:12:58 compute-0 systemd-rc-local-generator[163910]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:12:59 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 19 19:12:59 compute-0 systemd[1]: Starting Open-iSCSI...
Feb 19 19:12:59 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Feb 19 19:12:59 compute-0 systemd[1]: Started Open-iSCSI.
Feb 19 19:12:59 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Feb 19 19:12:59 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Feb 19 19:12:59 compute-0 sudo[163888]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:00 compute-0 python3.9[164099]: ansible-ansible.builtin.service_facts Invoked
Feb 19 19:13:00 compute-0 network[164116]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 19 19:13:00 compute-0 network[164117]: 'network-scripts' will be removed from distribution in near future.
Feb 19 19:13:00 compute-0 network[164118]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 19 19:13:03 compute-0 sudo[164388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trutvgwoakocdlupjqwwpfjiuqrymnbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528383.6311285-291-106868014751430/AnsiballZ_dnf.py'
Feb 19 19:13:03 compute-0 sudo[164388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:04 compute-0 python3.9[164391]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 19 19:13:05 compute-0 podman[164393]: 2026-02-19 19:13:05.304891684 +0000 UTC m=+0.080291422 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:13:06 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 19 19:13:06 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 19 19:13:06 compute-0 systemd[1]: Reloading.
Feb 19 19:13:06 compute-0 systemd-sysv-generator[164458]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:13:06 compute-0 systemd-rc-local-generator[164455]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:13:06 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 19 19:13:06 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 19 19:13:06 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 19 19:13:06 compute-0 systemd[1]: run-ref96882638c14584ba4ed4887cbf7406.service: Deactivated successfully.
Feb 19 19:13:06 compute-0 sudo[164388]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:07 compute-0 sudo[164741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whspslhneysrwjtgotgrpzyfniadzzwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528387.2561016-309-227915382971292/AnsiballZ_file.py'
Feb 19 19:13:07 compute-0 sudo[164741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:07 compute-0 python3.9[164744]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 19 19:13:07 compute-0 sudo[164741]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:08 compute-0 sudo[164894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmtnomjobicxiuwozkqpzdhdecsaxafj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528387.9430637-325-111031938009130/AnsiballZ_modprobe.py'
Feb 19 19:13:08 compute-0 sudo[164894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:08 compute-0 python3.9[164897]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Feb 19 19:13:08 compute-0 sudo[164894]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:09 compute-0 sudo[165051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjdpdcuycwoerihbkcuagzipvhtrwimx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528388.7212305-341-273802742533909/AnsiballZ_stat.py'
Feb 19 19:13:09 compute-0 sudo[165051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:09 compute-0 python3.9[165054]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:13:09 compute-0 sudo[165051]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:09 compute-0 sudo[165175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-intkeqczahmvgsoqpftywrhbvrjvzhnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528388.7212305-341-273802742533909/AnsiballZ_copy.py'
Feb 19 19:13:09 compute-0 sudo[165175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:09 compute-0 python3.9[165178]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528388.7212305-341-273802742533909/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:09 compute-0 sudo[165175]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:09 compute-0 podman[165179]: 2026-02-19 19:13:09.832743077 +0000 UTC m=+0.082187398 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 19 19:13:10 compute-0 sudo[165357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joisoxhvtwfqbhawfqnpntulpqqedopp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528390.0875742-373-122927795383962/AnsiballZ_lineinfile.py'
Feb 19 19:13:10 compute-0 sudo[165357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:10 compute-0 python3.9[165360]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:10 compute-0 sudo[165357]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:11 compute-0 sudo[165510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvpsxzymzrehbdhaeheglhrjdncmuokf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528390.804614-389-108528577545968/AnsiballZ_systemd.py'
Feb 19 19:13:11 compute-0 sudo[165510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:11 compute-0 python3.9[165513]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 19 19:13:11 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 19 19:13:11 compute-0 systemd[1]: Stopped Load Kernel Modules.
Feb 19 19:13:11 compute-0 systemd[1]: Stopping Load Kernel Modules...
Feb 19 19:13:11 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 19 19:13:11 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 19 19:13:11 compute-0 sudo[165510]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:12 compute-0 sudo[165667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlxezlrzzgifolgqdkwrqpgiwzqxcmen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528391.8786714-405-263669701555509/AnsiballZ_command.py'
Feb 19 19:13:12 compute-0 sudo[165667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:12 compute-0 python3.9[165670]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:13:12 compute-0 sudo[165667]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:12 compute-0 sudo[165821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayjhlmogooyaalmpfeyddqphgsxliyjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528392.5994465-425-112901283174212/AnsiballZ_stat.py'
Feb 19 19:13:12 compute-0 sudo[165821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:12 compute-0 python3.9[165824]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:13:12 compute-0 sudo[165821]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:13 compute-0 sudo[165974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zufvqwpscbcetyldvlipnyvsiikavqym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528393.2994378-443-10027346950042/AnsiballZ_stat.py'
Feb 19 19:13:13 compute-0 sudo[165974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:13 compute-0 python3.9[165977]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:13:13 compute-0 sudo[165974]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:14 compute-0 sudo[166098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eytkngstwdmzartckmimolbxmuwerkum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528393.2994378-443-10027346950042/AnsiballZ_copy.py'
Feb 19 19:13:14 compute-0 sudo[166098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:14 compute-0 python3.9[166101]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528393.2994378-443-10027346950042/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:14 compute-0 sudo[166098]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:14 compute-0 sudo[166251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mokqjablmcbtbaftxltvwubzmqpnznbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528394.5173018-473-27592185624968/AnsiballZ_command.py'
Feb 19 19:13:14 compute-0 sudo[166251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:14 compute-0 python3.9[166254]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:13:14 compute-0 sudo[166251]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:15 compute-0 sudo[166405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arygrgwxutdomisidozmwaiwilsoqouf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528395.2552536-489-3991563878194/AnsiballZ_lineinfile.py'
Feb 19 19:13:15 compute-0 sudo[166405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:15 compute-0 python3.9[166408]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:15 compute-0 sudo[166405]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:16 compute-0 sudo[166558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcmdeuqczlqwxyqrgqmuendxhfhkuafo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528395.9628851-505-175104809263810/AnsiballZ_replace.py'
Feb 19 19:13:16 compute-0 sudo[166558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:16 compute-0 python3.9[166561]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:16 compute-0 sudo[166558]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:16 compute-0 sudo[166711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsgshsbjqawuomtfbygjuzglhhgebgej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528396.738898-521-9461288648424/AnsiballZ_replace.py'
Feb 19 19:13:16 compute-0 sudo[166711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:17 compute-0 python3.9[166714]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:17 compute-0 sudo[166711]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:17 compute-0 sudo[166864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldxxugrvvzojeukxpaqjvbenvckiedxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528397.4578562-539-71391045104545/AnsiballZ_lineinfile.py'
Feb 19 19:13:17 compute-0 sudo[166864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:17 compute-0 python3.9[166867]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:17 compute-0 sudo[166864]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:18 compute-0 sudo[167017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cizfdqepmotnkweerxxayugneanljbys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528398.024451-539-237691888911535/AnsiballZ_lineinfile.py'
Feb 19 19:13:18 compute-0 sudo[167017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:18 compute-0 python3.9[167020]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:18 compute-0 sudo[167017]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:18 compute-0 sudo[167170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihqivcuegpgwiqnfmwjaemhcvfzilxba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528398.565775-539-207498977877929/AnsiballZ_lineinfile.py'
Feb 19 19:13:18 compute-0 sudo[167170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:19 compute-0 python3.9[167173]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:19 compute-0 sudo[167170]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:19 compute-0 sudo[167323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpxhbuswtvaqydjtqhxlnvtsgnzljtqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528399.1716254-539-215193769786811/AnsiballZ_lineinfile.py'
Feb 19 19:13:19 compute-0 sudo[167323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:19 compute-0 python3.9[167326]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:19 compute-0 sudo[167323]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:20 compute-0 sudo[167476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydqdsdbtcbrqdjgewufxwattotvelonc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528400.026135-597-242852928778326/AnsiballZ_stat.py'
Feb 19 19:13:20 compute-0 sudo[167476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:20 compute-0 python3.9[167479]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:13:20 compute-0 sudo[167476]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:20 compute-0 sudo[167631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwtctujexqzwazhxwyfygovxnotglfgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528400.618635-613-151117329157275/AnsiballZ_command.py'
Feb 19 19:13:20 compute-0 sudo[167631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:20 compute-0 python3.9[167634]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:13:21 compute-0 sudo[167631]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:21 compute-0 sudo[167785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-neqneyeuijmjskywdbczhjpvppaggxnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528401.2247705-631-85422892232668/AnsiballZ_systemd_service.py'
Feb 19 19:13:21 compute-0 sudo[167785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:21 compute-0 python3.9[167788]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:13:21 compute-0 systemd[1]: Listening on multipathd control socket.
Feb 19 19:13:21 compute-0 sudo[167785]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:22 compute-0 sudo[167942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymsgawxxtqoacnlhdrkdggjfydgddcvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528402.0319467-647-132181290251978/AnsiballZ_systemd_service.py'
Feb 19 19:13:22 compute-0 sudo[167942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:22 compute-0 python3.9[167945]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:13:22 compute-0 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Feb 19 19:13:22 compute-0 udevadm[167950]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Feb 19 19:13:22 compute-0 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Feb 19 19:13:22 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 19 19:13:22 compute-0 multipathd[167953]: --------start up--------
Feb 19 19:13:22 compute-0 multipathd[167953]: read /etc/multipath.conf
Feb 19 19:13:22 compute-0 multipathd[167953]: path checkers start up
Feb 19 19:13:22 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 19 19:13:22 compute-0 sudo[167942]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:23 compute-0 sudo[168111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sddzozcuwosfygnjbvpmartxsmsmzwum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528403.0593886-671-5228391412099/AnsiballZ_file.py'
Feb 19 19:13:23 compute-0 sudo[168111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:23 compute-0 python3.9[168114]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 19 19:13:23 compute-0 sudo[168111]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:23 compute-0 sudo[168264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxxevveixvwhsqwbrrvpsuvzamueqjpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528403.7218916-687-23314513027475/AnsiballZ_modprobe.py'
Feb 19 19:13:23 compute-0 sudo[168264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:24 compute-0 python3.9[168267]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Feb 19 19:13:24 compute-0 kernel: Key type psk registered
Feb 19 19:13:24 compute-0 sudo[168264]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:24 compute-0 sudo[168427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kogeqlwrnuuxtliceaonfexvmoyjlnap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528404.3288505-703-125472296505226/AnsiballZ_stat.py'
Feb 19 19:13:24 compute-0 sudo[168427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:24 compute-0 python3.9[168430]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:13:24 compute-0 sudo[168427]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:24 compute-0 sudo[168551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyrigcycoxmyurpjchnptovgcnjnbjty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528404.3288505-703-125472296505226/AnsiballZ_copy.py'
Feb 19 19:13:24 compute-0 sudo[168551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:25 compute-0 python3.9[168554]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528404.3288505-703-125472296505226/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:25 compute-0 sudo[168551]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:25 compute-0 sudo[168704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nddjkgnislgqbquyrcxnqmlrmmrxohfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528405.5515354-735-10630007454098/AnsiballZ_lineinfile.py'
Feb 19 19:13:25 compute-0 sudo[168704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:25 compute-0 python3.9[168707]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:25 compute-0 sudo[168704]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:26 compute-0 sudo[168857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oepajanhhjtunqrtlaoizsixvduqbsxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528406.119269-751-163195212137215/AnsiballZ_systemd.py'
Feb 19 19:13:26 compute-0 sudo[168857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:26 compute-0 python3.9[168860]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 19 19:13:26 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 19 19:13:26 compute-0 systemd[1]: Stopped Load Kernel Modules.
Feb 19 19:13:26 compute-0 systemd[1]: Stopping Load Kernel Modules...
Feb 19 19:13:26 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 19 19:13:26 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 19 19:13:26 compute-0 sudo[168857]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:27 compute-0 sudo[169014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thxdfovtdvcgcedfrwnogbmqjgtjxupf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528407.1838472-767-275254869456650/AnsiballZ_dnf.py'
Feb 19 19:13:27 compute-0 sudo[169014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:27 compute-0 python3.9[169017]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 19 19:13:29 compute-0 systemd[1]: Reloading.
Feb 19 19:13:29 compute-0 systemd-sysv-generator[169053]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:13:29 compute-0 systemd-rc-local-generator[169050]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:13:29 compute-0 systemd[1]: Reloading.
Feb 19 19:13:29 compute-0 systemd-sysv-generator[169096]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:13:29 compute-0 systemd-rc-local-generator[169091]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:13:30 compute-0 systemd-logind[822]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 19 19:13:30 compute-0 systemd-logind[822]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 19 19:13:30 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 19 19:13:30 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 19 19:13:30 compute-0 systemd[1]: Reloading.
Feb 19 19:13:30 compute-0 systemd-rc-local-generator[169192]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:13:30 compute-0 systemd-sysv-generator[169198]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:13:30 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 19 19:13:30 compute-0 sudo[169014]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:31 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 19 19:13:31 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 19 19:13:31 compute-0 systemd[1]: run-rb4b7c492947b44c29706dbda7b13daf6.service: Deactivated successfully.
Feb 19 19:13:31 compute-0 sudo[170510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uodonzmxbgotimbzlmwlqrxmwrluejze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528411.4035423-783-5555452358392/AnsiballZ_systemd_service.py'
Feb 19 19:13:31 compute-0 sudo[170510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:31 compute-0 python3.9[170513]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 19 19:13:31 compute-0 iscsid[163939]: iscsid shutting down.
Feb 19 19:13:31 compute-0 systemd[1]: Stopping Open-iSCSI...
Feb 19 19:13:31 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Feb 19 19:13:31 compute-0 systemd[1]: Stopped Open-iSCSI.
Feb 19 19:13:31 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 19 19:13:31 compute-0 systemd[1]: Starting Open-iSCSI...
Feb 19 19:13:31 compute-0 systemd[1]: Started Open-iSCSI.
Feb 19 19:13:31 compute-0 sudo[170510]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:13:32.100 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:13:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:13:32.101 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:13:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:13:32.101 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:13:32 compute-0 sudo[170668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksaorlswuwaodnnhoqtpuajteqorhwsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528412.0708334-799-54416288734539/AnsiballZ_systemd_service.py'
Feb 19 19:13:32 compute-0 sudo[170668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:32 compute-0 python3.9[170671]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 19 19:13:32 compute-0 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Feb 19 19:13:32 compute-0 multipathd[167953]: exit (signal)
Feb 19 19:13:32 compute-0 multipathd[167953]: --------shut down-------
Feb 19 19:13:32 compute-0 systemd[1]: multipathd.service: Deactivated successfully.
Feb 19 19:13:32 compute-0 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Feb 19 19:13:32 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 19 19:13:32 compute-0 multipathd[170678]: --------start up--------
Feb 19 19:13:32 compute-0 multipathd[170678]: read /etc/multipath.conf
Feb 19 19:13:32 compute-0 multipathd[170678]: path checkers start up
Feb 19 19:13:32 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 19 19:13:32 compute-0 sudo[170668]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:33 compute-0 python3.9[170836]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:13:33 compute-0 sudo[170990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgtmuzzmfmrlizdiypumyvkyhsvolzwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528413.7717953-834-78894036561353/AnsiballZ_file.py'
Feb 19 19:13:33 compute-0 sudo[170990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:34 compute-0 python3.9[170993]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:34 compute-0 sudo[170990]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:34 compute-0 sudo[171143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcjqrdhwjyzrytodxzvurlldgyqvlwuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528414.6291306-856-151244250091892/AnsiballZ_systemd_service.py'
Feb 19 19:13:34 compute-0 sudo[171143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:35 compute-0 python3.9[171146]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 19 19:13:35 compute-0 systemd[1]: Reloading.
Feb 19 19:13:35 compute-0 systemd-rc-local-generator[171169]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:13:35 compute-0 systemd-sysv-generator[171172]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:13:35 compute-0 sudo[171143]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:35 compute-0 podman[171189]: 2026-02-19 19:13:35.467263804 +0000 UTC m=+0.043838645 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 19 19:13:35 compute-0 python3.9[171357]: ansible-ansible.builtin.service_facts Invoked
Feb 19 19:13:35 compute-0 network[171374]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 19 19:13:35 compute-0 network[171375]: 'network-scripts' will be removed from distribution in near future.
Feb 19 19:13:35 compute-0 network[171376]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 19 19:13:38 compute-0 sudo[171647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzoozdgizkcdtmedkzyqauykzucnyhrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528418.5284371-894-73050844350663/AnsiballZ_systemd_service.py'
Feb 19 19:13:38 compute-0 sudo[171647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:38 compute-0 python3.9[171650]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:13:39 compute-0 sudo[171647]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:39 compute-0 sudo[171801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rolfvifqownajbejzzxdcecsdaybtybw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528419.1046169-894-31449081715133/AnsiballZ_systemd_service.py'
Feb 19 19:13:39 compute-0 sudo[171801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:39 compute-0 python3.9[171804]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:13:39 compute-0 sudo[171801]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:39 compute-0 sudo[171961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymhyjzcnaljexbjarwlwkyvbduefvlab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528419.6947105-894-71895476725414/AnsiballZ_systemd_service.py'
Feb 19 19:13:39 compute-0 sudo[171961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:39 compute-0 podman[171929]: 2026-02-19 19:13:39.980479223 +0000 UTC m=+0.079090033 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Feb 19 19:13:40 compute-0 python3.9[171971]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:13:40 compute-0 sudo[171961]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:40 compute-0 sudo[172135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdmlpncjzyvdmndkxixggnipjelkskxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528420.3364928-894-157559468822095/AnsiballZ_systemd_service.py'
Feb 19 19:13:40 compute-0 sudo[172135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:40 compute-0 python3.9[172138]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:13:40 compute-0 sudo[172135]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:41 compute-0 sudo[172289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akwumgyjubgmgcjcxgppprxjtvzzubtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528420.935682-894-35183285201221/AnsiballZ_systemd_service.py'
Feb 19 19:13:41 compute-0 sudo[172289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:41 compute-0 python3.9[172292]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:13:41 compute-0 sudo[172289]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:41 compute-0 sudo[172443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haxwofrqmxkdpymlxnafrqxorcuxnajm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528421.5447314-894-145317392312318/AnsiballZ_systemd_service.py'
Feb 19 19:13:41 compute-0 sudo[172443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:42 compute-0 python3.9[172446]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:13:42 compute-0 sudo[172443]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:42 compute-0 sudo[172597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjgadwcdylamwfpnmoqiebsyavwwzlrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528422.160262-894-178276997760016/AnsiballZ_systemd_service.py'
Feb 19 19:13:42 compute-0 sudo[172597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:42 compute-0 python3.9[172600]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:13:42 compute-0 sudo[172597]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:42 compute-0 sudo[172751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdwrprrairzfvlxonoobtoqyiwgivust ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528422.753299-894-125953632699681/AnsiballZ_systemd_service.py'
Feb 19 19:13:42 compute-0 sudo[172751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:43 compute-0 python3.9[172754]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:13:43 compute-0 sudo[172751]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:44 compute-0 sudo[172905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebblgaezashaclikfdqjypxjvagxuyaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528424.1936646-1012-228464282751681/AnsiballZ_file.py'
Feb 19 19:13:44 compute-0 sudo[172905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:44 compute-0 python3.9[172908]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:44 compute-0 sudo[172905]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:44 compute-0 sudo[173058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynrhshzzdssbymahykbmfklwzfmvtdyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528424.706676-1012-9742769322179/AnsiballZ_file.py'
Feb 19 19:13:44 compute-0 sudo[173058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:45 compute-0 python3.9[173061]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:45 compute-0 sudo[173058]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:45 compute-0 sudo[173211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkqsgcfjfcqunnuxsvgrblhbsxwcmcwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528425.1852055-1012-231977304687992/AnsiballZ_file.py'
Feb 19 19:13:45 compute-0 sudo[173211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:45 compute-0 python3.9[173214]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:45 compute-0 sudo[173211]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:45 compute-0 sudo[173364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aofooskdbpfycaiidofphoecrbpclatq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528425.6796567-1012-244240256615667/AnsiballZ_file.py'
Feb 19 19:13:45 compute-0 sudo[173364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:46 compute-0 python3.9[173367]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:46 compute-0 sudo[173364]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:46 compute-0 sudo[173517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftxupkdjtcfgmerhfzfozteoinrurvsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528426.1114614-1012-93356108435318/AnsiballZ_file.py'
Feb 19 19:13:46 compute-0 sudo[173517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:46 compute-0 python3.9[173520]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:46 compute-0 sudo[173517]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:46 compute-0 sudo[173670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjkmjkerdvadgjegpfxmdrthzykjyewx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528426.721171-1012-1617859537799/AnsiballZ_file.py'
Feb 19 19:13:46 compute-0 sudo[173670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:47 compute-0 python3.9[173673]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:47 compute-0 sudo[173670]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:47 compute-0 sudo[173823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwyvzubougwfzwizvtdrkzhglcindwwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528427.2214363-1012-271214691181295/AnsiballZ_file.py'
Feb 19 19:13:47 compute-0 sudo[173823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:47 compute-0 python3.9[173826]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:47 compute-0 sudo[173823]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:47 compute-0 sudo[173976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kawvljobogocorrqdbabhilqfywtnsjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528427.6869445-1012-93899655519216/AnsiballZ_file.py'
Feb 19 19:13:47 compute-0 sudo[173976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:48 compute-0 python3.9[173979]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:48 compute-0 sudo[173976]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:49 compute-0 sudo[174129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lawrixzvsggdkeymywswpllkadhybhpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528428.8621607-1126-200181112588041/AnsiballZ_file.py'
Feb 19 19:13:49 compute-0 sudo[174129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:49 compute-0 python3.9[174132]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:49 compute-0 sudo[174129]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:49 compute-0 sudo[174282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utihlfiidmaprgrpsinqnliyukekcnyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528429.4518416-1126-22606778299548/AnsiballZ_file.py'
Feb 19 19:13:49 compute-0 sudo[174282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:49 compute-0 python3.9[174285]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:49 compute-0 sudo[174282]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:50 compute-0 sudo[174435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwavlrvslpdddxxbklphdzeixdkoaloj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528429.930441-1126-3746456526448/AnsiballZ_file.py'
Feb 19 19:13:50 compute-0 sudo[174435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:50 compute-0 python3.9[174438]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:50 compute-0 sudo[174435]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:50 compute-0 sudo[174588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yosdddhrykjhdkbryembmixzeacuzwet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528430.4171221-1126-271119939025985/AnsiballZ_file.py'
Feb 19 19:13:50 compute-0 sudo[174588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:50 compute-0 python3.9[174591]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:50 compute-0 sudo[174588]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:51 compute-0 sudo[174741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deovtvipedvxwbiejpdzqzdmpsbpehzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528431.0205352-1126-279602083359239/AnsiballZ_file.py'
Feb 19 19:13:51 compute-0 sudo[174741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:51 compute-0 python3.9[174744]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:51 compute-0 sudo[174741]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:51 compute-0 sudo[174894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwunjuypqepwmgejssglrsoutcqlpfaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528431.5237505-1126-63916480535246/AnsiballZ_file.py'
Feb 19 19:13:51 compute-0 sudo[174894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:51 compute-0 python3.9[174897]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:51 compute-0 sudo[174894]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:52 compute-0 sudo[175047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tywefwjlrhvxmavraozcizxxkmcbebpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528432.1266038-1126-190921836707689/AnsiballZ_file.py'
Feb 19 19:13:52 compute-0 sudo[175047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:52 compute-0 python3.9[175050]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:52 compute-0 sudo[175047]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:53 compute-0 sudo[175200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhtqdqvdmglrsctghgqmqmmigcqwhcua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528432.773591-1126-55927556687484/AnsiballZ_file.py'
Feb 19 19:13:53 compute-0 sudo[175200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:53 compute-0 python3.9[175203]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:13:53 compute-0 sudo[175200]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:54 compute-0 sudo[175353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqyktvzeydsygfxzudckzxunafoalllz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528433.87689-1242-96596788324409/AnsiballZ_command.py'
Feb 19 19:13:54 compute-0 sudo[175353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:54 compute-0 python3.9[175356]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:13:54 compute-0 sudo[175353]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:55 compute-0 python3.9[175508]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 19 19:13:55 compute-0 sudo[175658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cerivocegycsjtitnommnhqllhsdonat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528435.5065067-1278-94385024304152/AnsiballZ_systemd_service.py'
Feb 19 19:13:55 compute-0 sudo[175658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:56 compute-0 python3.9[175661]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 19 19:13:56 compute-0 systemd[1]: Reloading.
Feb 19 19:13:56 compute-0 systemd-rc-local-generator[175680]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:13:56 compute-0 systemd-sysv-generator[175686]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:13:56 compute-0 sudo[175658]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:56 compute-0 sudo[175852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkspdaelokidpbmupvtlmqwsnrdsnqqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528436.4910643-1294-75596251182773/AnsiballZ_command.py'
Feb 19 19:13:56 compute-0 sudo[175852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:56 compute-0 python3.9[175855]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:13:56 compute-0 sudo[175852]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:57 compute-0 sudo[176006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agoduxakqpmsojgnakshbcsmvbcigrsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528436.9628274-1294-241057879732676/AnsiballZ_command.py'
Feb 19 19:13:57 compute-0 sudo[176006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:57 compute-0 python3.9[176009]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:13:57 compute-0 sudo[176006]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:57 compute-0 sudo[176160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdtjrkxqbgkytqjdgryunshyyaghuktb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528437.4806042-1294-240792407912986/AnsiballZ_command.py'
Feb 19 19:13:57 compute-0 sudo[176160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:57 compute-0 python3.9[176163]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:13:57 compute-0 sudo[176160]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:58 compute-0 sudo[176314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iewidxhcrqkeqwwywodxruxssbydrogz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528437.9961731-1294-260600914858295/AnsiballZ_command.py'
Feb 19 19:13:58 compute-0 sudo[176314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:58 compute-0 python3.9[176317]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:13:58 compute-0 sudo[176314]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:58 compute-0 sudo[176468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vthmorcirabjlmbkeoacxhqnuquvxvde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528438.5372639-1294-250380243958775/AnsiballZ_command.py'
Feb 19 19:13:58 compute-0 sudo[176468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:58 compute-0 python3.9[176471]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:13:58 compute-0 sudo[176468]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:59 compute-0 sudo[176622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhnvealdxvwtdazitrsrqdwwctucueci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528439.074973-1294-205242654095076/AnsiballZ_command.py'
Feb 19 19:13:59 compute-0 sudo[176622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:59 compute-0 python3.9[176625]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:13:59 compute-0 sudo[176622]: pam_unix(sudo:session): session closed for user root
Feb 19 19:13:59 compute-0 sudo[176776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqlbcfzflcbimjiiwxbzsvipvscudpbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528439.5681403-1294-169097926472598/AnsiballZ_command.py'
Feb 19 19:13:59 compute-0 sudo[176776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:13:59 compute-0 python3.9[176779]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:13:59 compute-0 sudo[176776]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:00 compute-0 sudo[176930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boqmhgiryrdmwcgurexrybajshbgvzrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528440.081148-1294-267414786964939/AnsiballZ_command.py'
Feb 19 19:14:00 compute-0 sudo[176930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:00 compute-0 python3.9[176933]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:14:00 compute-0 sudo[176930]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:01 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Feb 19 19:14:02 compute-0 sudo[177085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awebyequdyjtnrlraxmektdrwopbnfsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528442.3954616-1437-224670378378372/AnsiballZ_file.py'
Feb 19 19:14:02 compute-0 sudo[177085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:02 compute-0 python3.9[177088]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:14:02 compute-0 sudo[177085]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:02 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 19 19:14:03 compute-0 sudo[177239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuzsedpqvynacbdlxachyeercnhfyato ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528443.0905452-1437-271117266813463/AnsiballZ_file.py'
Feb 19 19:14:03 compute-0 sudo[177239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:03 compute-0 python3.9[177242]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:14:03 compute-0 sudo[177239]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:03 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Feb 19 19:14:03 compute-0 sudo[177393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idjzsnryihdccbkbsmcgobpsmrfxmmii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528443.7318425-1467-69924334910198/AnsiballZ_file.py'
Feb 19 19:14:03 compute-0 sudo[177393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:04 compute-0 python3.9[177396]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:14:04 compute-0 sudo[177393]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:04 compute-0 sudo[177546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovegkxpdvdwhwdpbbgleejgzylflvboe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528444.3317428-1467-280569173573550/AnsiballZ_file.py'
Feb 19 19:14:04 compute-0 sudo[177546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:04 compute-0 python3.9[177549]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:14:04 compute-0 sudo[177546]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:04 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Feb 19 19:14:05 compute-0 sudo[177700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzwmehzbjvczpxvzsbhuawllbscrasqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528444.8867311-1467-5992691467844/AnsiballZ_file.py'
Feb 19 19:14:05 compute-0 sudo[177700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:05 compute-0 python3.9[177703]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:14:05 compute-0 sudo[177700]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:05 compute-0 sudo[177869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yugzevjapanusljgkzfbmhysramjhrra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528445.4904046-1467-228231629633337/AnsiballZ_file.py'
Feb 19 19:14:05 compute-0 sudo[177869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:05 compute-0 podman[177827]: 2026-02-19 19:14:05.782811484 +0000 UTC m=+0.058430752 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Feb 19 19:14:05 compute-0 python3.9[177876]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:14:05 compute-0 sudo[177869]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:06 compute-0 sudo[178026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frmbuvohetdbkfjlluokbhwsbgbivwam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528446.1043038-1467-139907743165628/AnsiballZ_file.py'
Feb 19 19:14:06 compute-0 sudo[178026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:06 compute-0 python3.9[178029]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:14:06 compute-0 sudo[178026]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:06 compute-0 sudo[178179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inolmbsebdhfznrgkprgwzygblfwkgwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528446.7131028-1467-116076293942637/AnsiballZ_file.py'
Feb 19 19:14:06 compute-0 sudo[178179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:07 compute-0 python3.9[178182]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:14:07 compute-0 sudo[178179]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:07 compute-0 sudo[178332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saausvfrkzoxwgqxviqmvhksxqlgjnha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528447.3180804-1467-32874021861551/AnsiballZ_file.py'
Feb 19 19:14:07 compute-0 sudo[178332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:07 compute-0 python3.9[178335]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:14:07 compute-0 sudo[178332]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:10 compute-0 podman[178360]: 2026-02-19 19:14:10.301432905 +0000 UTC m=+0.081302948 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.license=GPLv2)
Feb 19 19:14:13 compute-0 sudo[178513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmmzkbwkmheoygvizvzoxxsvqqixjnop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528453.4269128-1744-201622563180958/AnsiballZ_getent.py'
Feb 19 19:14:13 compute-0 sudo[178513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:13 compute-0 python3.9[178516]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Feb 19 19:14:14 compute-0 sudo[178513]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:14 compute-0 sudo[178667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxjbvwakmcctetmjtuywrsgbmrsyzptr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528454.2115293-1760-8934827278263/AnsiballZ_group.py'
Feb 19 19:14:14 compute-0 sudo[178667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:14 compute-0 python3.9[178670]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 19 19:14:14 compute-0 groupadd[178671]: group added to /etc/group: name=nova, GID=42436
Feb 19 19:14:14 compute-0 groupadd[178671]: group added to /etc/gshadow: name=nova
Feb 19 19:14:14 compute-0 groupadd[178671]: new group: name=nova, GID=42436
Feb 19 19:14:14 compute-0 sudo[178667]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:15 compute-0 sudo[178826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isfjkzgjzertihximebgtwrqbxycpwkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528455.0509605-1776-95151277796458/AnsiballZ_user.py'
Feb 19 19:14:15 compute-0 sudo[178826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:15 compute-0 python3.9[178829]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 19 19:14:15 compute-0 useradd[178831]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/1
Feb 19 19:14:15 compute-0 useradd[178831]: add 'nova' to group 'libvirt'
Feb 19 19:14:15 compute-0 useradd[178831]: add 'nova' to shadow group 'libvirt'
Feb 19 19:14:15 compute-0 sudo[178826]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:16 compute-0 sshd-session[178862]: Accepted publickey for zuul from 192.168.122.30 port 36448 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 19:14:16 compute-0 systemd-logind[822]: New session 25 of user zuul.
Feb 19 19:14:16 compute-0 systemd[1]: Started Session 25 of User zuul.
Feb 19 19:14:16 compute-0 sshd-session[178862]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 19:14:16 compute-0 sshd-session[178865]: Received disconnect from 192.168.122.30 port 36448:11: disconnected by user
Feb 19 19:14:16 compute-0 sshd-session[178865]: Disconnected from user zuul 192.168.122.30 port 36448
Feb 19 19:14:16 compute-0 sshd-session[178862]: pam_unix(sshd:session): session closed for user zuul
Feb 19 19:14:16 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Feb 19 19:14:16 compute-0 systemd-logind[822]: Session 25 logged out. Waiting for processes to exit.
Feb 19 19:14:16 compute-0 systemd-logind[822]: Removed session 25.
Feb 19 19:14:17 compute-0 python3.9[179015]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:14:17 compute-0 python3.9[179091]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:14:18 compute-0 python3.9[179241]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:14:18 compute-0 python3.9[179362]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771528457.8686745-1826-25905584373520/.source _original_basename=ssh-config follow=False checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:14:19 compute-0 python3.9[179512]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:14:19 compute-0 python3.9[179633]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771528458.8736038-1826-130746335090667/.source.py _original_basename=nova_statedir_ownership.py follow=False checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:14:20 compute-0 python3.9[179783]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:14:20 compute-0 python3.9[179904]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771528459.791146-1826-56736045004317/.source _original_basename=run-on-host follow=False checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:14:21 compute-0 python3.9[180054]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:14:21 compute-0 python3.9[180175]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771528461.0331912-1934-235315962937676/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:14:22 compute-0 sudo[180325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skevktmzbocxrflfkosvhuiwzswpuyaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528462.0895512-1964-12027451832873/AnsiballZ_file.py'
Feb 19 19:14:22 compute-0 sudo[180325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:22 compute-0 python3.9[180328]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:14:22 compute-0 sudo[180325]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:22 compute-0 sudo[180478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlglpxgssaqdgyhrpilzxjisbkynxwki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528462.7318137-1980-214239263216014/AnsiballZ_copy.py'
Feb 19 19:14:22 compute-0 sudo[180478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:23 compute-0 python3.9[180481]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:14:23 compute-0 sudo[180478]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:23 compute-0 sudo[180631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxqtwmlosdmzjjzueetxljfwdwklpczh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528463.4161499-1996-175011729450984/AnsiballZ_stat.py'
Feb 19 19:14:23 compute-0 sudo[180631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:23 compute-0 python3.9[180634]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:14:23 compute-0 sudo[180631]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:24 compute-0 sudo[180784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fptpfglvnomgwpytcaewaajmhmologip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528464.0566251-2012-96363801085642/AnsiballZ_stat.py'
Feb 19 19:14:24 compute-0 sudo[180784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:24 compute-0 python3.9[180787]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:14:24 compute-0 sudo[180784]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:24 compute-0 sudo[180908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfwmcnjdsyhdrieneaptnwrkmxjjdcyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528464.0566251-2012-96363801085642/AnsiballZ_copy.py'
Feb 19 19:14:24 compute-0 sudo[180908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:24 compute-0 python3.9[180911]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1771528464.0566251-2012-96363801085642/.source _original_basename=.h6dz9qoo follow=False checksum=71e659fac0e017ad46704b1d552260f6b3562ed3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Feb 19 19:14:24 compute-0 sudo[180908]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:25 compute-0 python3.9[181063]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:14:26 compute-0 sshd-session[181064]: Invalid user n8n from 138.255.157.62 port 51531
Feb 19 19:14:26 compute-0 sudo[181217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hogezbtqqznkzqqorbqkialwoukavenv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528466.177681-2068-15870593174011/AnsiballZ_file.py'
Feb 19 19:14:26 compute-0 sudo[181217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:26 compute-0 sshd-session[181064]: Received disconnect from 138.255.157.62 port 51531:11: Bye Bye [preauth]
Feb 19 19:14:26 compute-0 sshd-session[181064]: Disconnected from invalid user n8n 138.255.157.62 port 51531 [preauth]
Feb 19 19:14:26 compute-0 python3.9[181220]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:14:26 compute-0 sudo[181217]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:27 compute-0 sudo[181370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncoatkfeiyyhiucsijctdhoyuiwjdzar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528466.8445973-2084-42231658820519/AnsiballZ_file.py'
Feb 19 19:14:27 compute-0 sudo[181370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:27 compute-0 python3.9[181373]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:14:27 compute-0 sudo[181370]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:27 compute-0 python3.9[181523]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute_init state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:14:29 compute-0 sudo[181944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmraqdlcyggjzcefccwdjpatgygeskex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528469.5872166-2152-239031061464894/AnsiballZ_container_config_data.py'
Feb 19 19:14:29 compute-0 sudo[181944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:30 compute-0 python3.9[181947]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute_init config_pattern=*.json debug=False
Feb 19 19:14:30 compute-0 sudo[181944]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:30 compute-0 sudo[182097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrjcqpyxzskunumzwcvgtchukjcbmukg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528470.4662855-2174-103552317588098/AnsiballZ_container_config_hash.py'
Feb 19 19:14:30 compute-0 sudo[182097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:30 compute-0 python3.9[182100]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 19 19:14:31 compute-0 sudo[182097]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:31 compute-0 sudo[182250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhueljbufpgizmtwiecfbvwwlojxtsgm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771528471.3982506-2194-109391434110061/AnsiballZ_edpm_container_manage.py'
Feb 19 19:14:31 compute-0 sudo[182250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:32 compute-0 python3[182253]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute_init config_id=nova_compute_init config_overrides={} config_patterns=*.json containers=['nova_compute_init'] log_base_path=/var/log/containers/stdouts debug=False
Feb 19 19:14:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:14:32.101 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:14:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:14:32.102 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:14:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:14:32.102 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:14:32 compute-0 podman[182291]: 2026-02-19 19:14:32.147637235 +0000 UTC m=+0.047500775 container create 6f7f6b29aa5c747c376dd46dc00fbc27e3bc82fd22a26f9c73cffe17e8c7ac8c (image=38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.43.0, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '5341527c8dd34d6abf221d866e12a549ca4c7217b5db19aa77437583172ec720'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, container_name=nova_compute_init, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=nova_compute_init)
Feb 19 19:14:32 compute-0 podman[182291]: 2026-02-19 19:14:32.120575191 +0000 UTC m=+0.020438751 image pull a2cf355e1328741433ea45a579c41828962820431e14bc44b297a8d036ff250d 38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Feb 19 19:14:32 compute-0 python3[182253]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --env EDPM_CONFIG_HASH=5341527c8dd34d6abf221d866e12a549ca4c7217b5db19aa77437583172ec720 --label config_id=nova_compute_init --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '5341527c8dd34d6abf221d866e12a549ca4c7217b5db19aa77437583172ec720'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z 38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Feb 19 19:14:32 compute-0 sudo[182250]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:32 compute-0 sudo[182480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewnlbfpfvailqvhlkqxgqwwxhqrggxec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528472.7090116-2210-107216931928022/AnsiballZ_stat.py'
Feb 19 19:14:32 compute-0 sudo[182480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:33 compute-0 python3.9[182483]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:14:33 compute-0 sudo[182480]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:34 compute-0 python3.9[182635]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 19 19:14:35 compute-0 sudo[182785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcaxcanohivwnlpinkuicpttaxqyobrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528474.7798884-2264-86156206920633/AnsiballZ_stat.py'
Feb 19 19:14:35 compute-0 sudo[182785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:35 compute-0 python3.9[182788]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:14:35 compute-0 sudo[182785]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:35 compute-0 sudo[182911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcefqndkxsrqjoujvfdhzstvvbxxqkik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528474.7798884-2264-86156206920633/AnsiballZ_copy.py'
Feb 19 19:14:35 compute-0 sudo[182911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:35 compute-0 python3.9[182914]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528474.7798884-2264-86156206920633/.source.yaml _original_basename=._kx33m7q follow=False checksum=13c02b7e00075f73f1f0a2607b21d93533ab7198 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:14:35 compute-0 sudo[182911]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:35 compute-0 podman[182915]: 2026-02-19 19:14:35.885338103 +0000 UTC m=+0.054007277 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 19 19:14:36 compute-0 sudo[183084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdsbejobkronokheyoumifgmdthxxiwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528476.221528-2298-19285841017950/AnsiballZ_file.py'
Feb 19 19:14:36 compute-0 sudo[183084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:36 compute-0 python3.9[183087]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:14:36 compute-0 sudo[183084]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:37 compute-0 sudo[183237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-madwrjlykivmasqdxrkqdxrzlfklypkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528476.8281825-2314-22183398768638/AnsiballZ_file.py'
Feb 19 19:14:37 compute-0 sudo[183237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:37 compute-0 python3.9[183240]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:14:37 compute-0 sudo[183237]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:37 compute-0 sudo[183390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bomjnyfutkgqyjkirjusgweasolepebq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528477.5528274-2330-61931411050151/AnsiballZ_stat.py'
Feb 19 19:14:37 compute-0 sudo[183390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:37 compute-0 python3.9[183393]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:14:38 compute-0 sudo[183390]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:38 compute-0 sudo[183514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihwatvymvvqsgveqgpvouigbkwvpmjav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528477.5528274-2330-61931411050151/AnsiballZ_copy.py'
Feb 19 19:14:38 compute-0 sudo[183514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:38 compute-0 python3.9[183517]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/nova_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528477.5528274-2330-61931411050151/.source.json _original_basename=.wu1txuq1 follow=False checksum=0018389a48392615f4a8869cad43008a907328ff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:14:38 compute-0 sudo[183514]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:39 compute-0 python3.9[183667]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:14:40 compute-0 sudo[184105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeoqurrgvywzzjqrigxqoucjifzqekow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528480.6107168-2410-55911420764540/AnsiballZ_container_config_data.py'
Feb 19 19:14:40 compute-0 sudo[184105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:40 compute-0 podman[184062]: 2026-02-19 19:14:40.911278928 +0000 UTC m=+0.114491213 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:14:41 compute-0 python3.9[184114]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute config_pattern=*.json debug=False
Feb 19 19:14:41 compute-0 sudo[184105]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:41 compute-0 sudo[184267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjpimqnsjpmbnvcrillmdukzcyqgvomf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528481.4027398-2432-230841418655491/AnsiballZ_container_config_hash.py'
Feb 19 19:14:41 compute-0 sudo[184267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:41 compute-0 python3.9[184270]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 19 19:14:41 compute-0 sudo[184267]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:42 compute-0 sudo[184420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbjsfhqdycembtgctlqhpirmqkwfxgyx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771528482.1462567-2452-240040438023033/AnsiballZ_edpm_container_manage.py'
Feb 19 19:14:42 compute-0 sudo[184420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:42 compute-0 python3[184423]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute config_id=nova_compute config_overrides={} config_patterns=*.json containers=['nova_compute'] log_base_path=/var/log/containers/stdouts debug=False
Feb 19 19:14:42 compute-0 podman[184459]: 2026-02-19 19:14:42.712536672 +0000 UTC m=+0.071238609 container create faf00e9823aab32743a3c5e8285c40c46929d8e72d00f7dbe49efb141f3bd600 (image=38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, managed_by=edpm_ansible, io.buildah.version=1.43.0, config_id=nova_compute, container_name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-5341527c8dd34d6abf221d866e12a549ca4c7217b5db19aa77437583172ec720'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:14:42 compute-0 podman[184459]: 2026-02-19 19:14:42.660056659 +0000 UTC m=+0.018758626 image pull a2cf355e1328741433ea45a579c41828962820431e14bc44b297a8d036ff250d 38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Feb 19 19:14:42 compute-0 python3[184423]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-5341527c8dd34d6abf221d866e12a549ca4c7217b5db19aa77437583172ec720 --label config_id=nova_compute --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-5341527c8dd34d6abf221d866e12a549ca4c7217b5db19aa77437583172ec720'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume 
/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro 38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest kolla_start
Feb 19 19:14:42 compute-0 sudo[184420]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:43 compute-0 sudo[184647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmbtpehztbgdzhfvcygdfrnvbmbwkprs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528483.0178583-2468-172847475791953/AnsiballZ_stat.py'
Feb 19 19:14:43 compute-0 sudo[184647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:43 compute-0 python3.9[184650]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:14:43 compute-0 sudo[184647]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:43 compute-0 sudo[184802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjzddydpocumcejbozweovnzxypizmxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528483.7186813-2486-59876434310992/AnsiballZ_file.py'
Feb 19 19:14:43 compute-0 sudo[184802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:44 compute-0 python3.9[184805]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:14:44 compute-0 sudo[184802]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:44 compute-0 sudo[184879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogspvmldffrbortjsuocgmujrouegnfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528483.7186813-2486-59876434310992/AnsiballZ_stat.py'
Feb 19 19:14:44 compute-0 sudo[184879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:44 compute-0 python3.9[184882]: ansible-stat Invoked with path=/etc/systemd/system/edpm_nova_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:14:44 compute-0 sudo[184879]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:44 compute-0 sudo[185031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcuwpkopmvyjcagztiglnuhvsiwezjrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528484.5707428-2486-116758136130652/AnsiballZ_copy.py'
Feb 19 19:14:44 compute-0 sudo[185031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:45 compute-0 python3.9[185034]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771528484.5707428-2486-116758136130652/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:14:45 compute-0 sudo[185031]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:45 compute-0 sudo[185108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obflfavetkuqkjacoxungtlvhhwydpvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528484.5707428-2486-116758136130652/AnsiballZ_systemd.py'
Feb 19 19:14:45 compute-0 sudo[185108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:45 compute-0 python3.9[185111]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 19 19:14:45 compute-0 systemd[1]: Reloading.
Feb 19 19:14:45 compute-0 systemd-rc-local-generator[185136]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:14:45 compute-0 systemd-sysv-generator[185142]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:14:46 compute-0 sudo[185108]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:46 compute-0 sudo[185228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhxqtqqigntsnigngtuvhavpohwmskmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528484.5707428-2486-116758136130652/AnsiballZ_systemd.py'
Feb 19 19:14:46 compute-0 sudo[185228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:46 compute-0 python3.9[185231]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:14:46 compute-0 systemd[1]: Reloading.
Feb 19 19:14:46 compute-0 systemd-rc-local-generator[185258]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:14:46 compute-0 systemd-sysv-generator[185261]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:14:46 compute-0 systemd[1]: Starting nova_compute container...
Feb 19 19:14:46 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:14:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aff0712aec6d7b9f713f644b07baef62f58430fe3526737610d4561e67a69608/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 19 19:14:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aff0712aec6d7b9f713f644b07baef62f58430fe3526737610d4561e67a69608/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 19 19:14:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aff0712aec6d7b9f713f644b07baef62f58430fe3526737610d4561e67a69608/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 19 19:14:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aff0712aec6d7b9f713f644b07baef62f58430fe3526737610d4561e67a69608/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 19 19:14:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aff0712aec6d7b9f713f644b07baef62f58430fe3526737610d4561e67a69608/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 19 19:14:46 compute-0 podman[185277]: 2026-02-19 19:14:46.938023942 +0000 UTC m=+0.096274520 container init faf00e9823aab32743a3c5e8285c40c46929d8e72d00f7dbe49efb141f3bd600 (image=38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-5341527c8dd34d6abf221d866e12a549ca4c7217b5db19aa77437583172ec720'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=nova_compute)
Feb 19 19:14:46 compute-0 podman[185277]: 2026-02-19 19:14:46.942214003 +0000 UTC m=+0.100464571 container start faf00e9823aab32743a3c5e8285c40c46929d8e72d00f7dbe49efb141f3bd600 (image=38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=nova_compute, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-5341527c8dd34d6abf221d866e12a549ca4c7217b5db19aa77437583172ec720'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.build-date=20260216)
Feb 19 19:14:46 compute-0 nova_compute[185292]: + sudo -E kolla_set_configs
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Validating config file
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Copying service configuration files
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Copying /var/lib/kolla/config_files/src/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Deleting /etc/ceph
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Creating directory /etc/ceph
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Setting permission for /etc/ceph
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Writing out command to execute
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 19 19:14:47 compute-0 nova_compute[185292]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 19 19:14:47 compute-0 nova_compute[185292]: ++ cat /run_command
Feb 19 19:14:47 compute-0 nova_compute[185292]: + CMD=nova-compute
Feb 19 19:14:47 compute-0 nova_compute[185292]: + ARGS=
Feb 19 19:14:47 compute-0 nova_compute[185292]: + sudo kolla_copy_cacerts
Feb 19 19:14:47 compute-0 nova_compute[185292]: + [[ ! -n '' ]]
Feb 19 19:14:47 compute-0 nova_compute[185292]: + . kolla_extend_start
Feb 19 19:14:47 compute-0 nova_compute[185292]: Running command: 'nova-compute'
Feb 19 19:14:47 compute-0 nova_compute[185292]: + echo 'Running command: '\''nova-compute'\'''
Feb 19 19:14:47 compute-0 nova_compute[185292]: + umask 0022
Feb 19 19:14:47 compute-0 nova_compute[185292]: + exec nova-compute
Feb 19 19:14:47 compute-0 podman[185277]: nova_compute
Feb 19 19:14:47 compute-0 systemd[1]: Started nova_compute container.
Feb 19 19:14:47 compute-0 sudo[185228]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:47 compute-0 python3.9[185453]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 19 19:14:48 compute-0 sudo[185603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvzmjxlorduipdaujmurohvqpkohtxzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528488.316132-2576-220782246570156/AnsiballZ_stat.py'
Feb 19 19:14:48 compute-0 sudo[185603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:48 compute-0 python3.9[185606]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:14:48 compute-0 sudo[185603]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:48 compute-0 nova_compute[185292]: 2026-02-19 19:14:48.804 185296 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Feb 19 19:14:48 compute-0 nova_compute[185292]: 2026-02-19 19:14:48.804 185296 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Feb 19 19:14:48 compute-0 nova_compute[185292]: 2026-02-19 19:14:48.805 185296 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Feb 19 19:14:48 compute-0 nova_compute[185292]: 2026-02-19 19:14:48.805 185296 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 19 19:14:48 compute-0 nova_compute[185292]: 2026-02-19 19:14:48.917 185296 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:14:48 compute-0 nova_compute[185292]: 2026-02-19 19:14:48.927 185296 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.010s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:14:48 compute-0 nova_compute[185292]: 2026-02-19 19:14:48.927 185296 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Feb 19 19:14:48 compute-0 nova_compute[185292]: 2026-02-19 19:14:48.954 185296 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Feb 19 19:14:48 compute-0 nova_compute[185292]: 2026-02-19 19:14:48.955 185296 WARNING oslo_config.cfg [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Feb 19 19:14:49 compute-0 sudo[185732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wobotscxmnjfnryklaahgxhuwfudxrot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528488.316132-2576-220782246570156/AnsiballZ_copy.py'
Feb 19 19:14:49 compute-0 sudo[185732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:49 compute-0 python3.9[185735]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528488.316132-2576-220782246570156/.source.yaml _original_basename=.6msxew2g follow=False checksum=91768eabf1db43942bc049ec91f875a96ac43844 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:14:49 compute-0 sudo[185732]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:49 compute-0 nova_compute[185292]: 2026-02-19 19:14:49.983 185296 INFO nova.virt.driver [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.061 185296 INFO nova.compute.provider_config [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 19 19:14:50 compute-0 python3.9[185885]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.567 185296 DEBUG oslo_concurrency.lockutils [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.567 185296 DEBUG oslo_concurrency.lockutils [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.567 185296 DEBUG oslo_concurrency.lockutils [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.568 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.568 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.568 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.568 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.568 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.568 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.569 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.569 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.569 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.569 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.569 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.569 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.569 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.569 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.569 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.570 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.570 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.570 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.570 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.570 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.570 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.570 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.570 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.571 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.571 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.571 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.571 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.571 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.571 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.571 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.571 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.572 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.572 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.572 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.572 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.572 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.572 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.572 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.572 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.572 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.573 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.573 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.573 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.573 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.573 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.573 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.573 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.573 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.574 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.574 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.574 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.574 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.574 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.574 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.574 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.574 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.575 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.575 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.575 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.575 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.575 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.575 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.575 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.575 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.576 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.576 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.576 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.576 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.576 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.576 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.576 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.576 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.577 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.577 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.577 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.577 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.577 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.577 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.577 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.577 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.577 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.578 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.578 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.578 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.578 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.578 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.578 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.578 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.578 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.579 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.579 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.579 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.579 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.579 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.579 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.579 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.579 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.579 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.579 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.580 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.580 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.580 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.580 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.580 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.580 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.580 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.580 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.581 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.581 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.581 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.581 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.581 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.581 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.581 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.581 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.582 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.582 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.582 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.582 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.582 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.582 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.582 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.582 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.583 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.583 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.583 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.583 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.583 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.583 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.583 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.583 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.583 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.584 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.584 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.584 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.584 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.584 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.584 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.584 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.584 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.585 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.585 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.585 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.585 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.585 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.585 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.585 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.585 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.586 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.586 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.586 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.586 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.586 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.586 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.586 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.586 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.587 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.587 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.587 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.587 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.587 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.587 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.587 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.587 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.588 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.588 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.588 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.588 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.588 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.588 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.588 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.588 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.589 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.589 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.589 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.589 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.589 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.589 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.589 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.589 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.590 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.590 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.590 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.590 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.590 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.590 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.590 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.591 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.591 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.591 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.591 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.591 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.591 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.591 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.591 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.591 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.592 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.592 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.592 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.592 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.592 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.592 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.592 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.592 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.593 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.593 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.593 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.593 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.593 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.593 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.593 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.593 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.593 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.594 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.594 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.594 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.594 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.594 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.594 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.594 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.594 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.595 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.595 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.595 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.595 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.595 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.595 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.595 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.595 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.595 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.596 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.596 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.596 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.596 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.596 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.596 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.596 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.596 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.596 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.597 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.597 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.597 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.597 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.597 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.597 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.597 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.597 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.598 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.598 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.598 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.598 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.598 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.598 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.598 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.599 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.599 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.599 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.599 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.599 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.599 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.599 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.599 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.600 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.600 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.600 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.600 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.600 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.600 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.600 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.600 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.601 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.601 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.601 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.601 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.601 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.601 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.601 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.602 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.602 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.602 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.602 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.602 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.602 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.603 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.603 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.603 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.603 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.603 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.603 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.604 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.604 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.604 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.604 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.604 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.604 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.605 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.605 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.605 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.605 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.605 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.605 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.606 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.606 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.606 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.606 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.606 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.606 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.606 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.607 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.607 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.607 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.607 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.607 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.607 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.608 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.608 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.608 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.608 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.608 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.608 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.609 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.609 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.609 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.609 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.609 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.609 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.610 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.610 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.610 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.610 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.610 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.611 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.611 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.611 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.611 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.611 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.611 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.612 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.612 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.612 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.612 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.612 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.612 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.612 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.612 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.613 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.613 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.613 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.613 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.613 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.613 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.613 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.613 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.614 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.614 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.614 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.614 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.614 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.614 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.614 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.615 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.615 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.615 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.615 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.615 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.615 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.615 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.615 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.616 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.616 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.616 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.616 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.616 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.616 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.616 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.616 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.617 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.617 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.617 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.617 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.617 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.617 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.617 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.618 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.618 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.618 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.618 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.618 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.618 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.618 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.618 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.618 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.619 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.619 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.619 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.619 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.619 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.619 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.619 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.619 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.620 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.620 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.620 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.620 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.620 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.620 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.620 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.620 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.621 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.621 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.621 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.621 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.621 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.621 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.621 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.621 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.622 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.622 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.622 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.622 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.622 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.622 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.622 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.622 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.623 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.623 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.623 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.623 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.623 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.623 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.623 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.623 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.624 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.624 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.624 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.624 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.624 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.624 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.624 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.624 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.624 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.625 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.625 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.625 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.625 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.625 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.625 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.625 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.625 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.626 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.626 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.626 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.626 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.626 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.626 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.626 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.626 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.627 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.627 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.627 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.627 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.627 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.627 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.627 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.628 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.628 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.628 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.628 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.628 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.628 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.628 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.628 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.629 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.629 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.629 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.629 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.629 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.629 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.629 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.629 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.630 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.630 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.630 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.630 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.630 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.630 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.630 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.630 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.631 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.631 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.631 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.631 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.631 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.631 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.631 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.631 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.631 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.632 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.632 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.632 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.632 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.632 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.632 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.632 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.632 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.633 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.633 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.633 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.633 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.633 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.633 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.633 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.633 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.634 185296 WARNING oslo_config.cfg [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 19 19:14:50 compute-0 nova_compute[185292]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 19 19:14:50 compute-0 nova_compute[185292]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 19 19:14:50 compute-0 nova_compute[185292]: and ``live_migration_inbound_addr`` respectively.
Feb 19 19:14:50 compute-0 nova_compute[185292]: ).  Its value may be silently ignored in the future.
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.634 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.634 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.634 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.634 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.634 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.635 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.635 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.635 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.635 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.635 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.635 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.635 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.635 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.636 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.636 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.636 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.636 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.636 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.636 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.636 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.636 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.636 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.637 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.637 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.637 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.637 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.637 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.637 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.637 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.637 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.638 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.638 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.638 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.638 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.638 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.638 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.638 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.639 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.639 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.639 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.639 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.639 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.639 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.639 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.639 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.639 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.640 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.640 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.640 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.640 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.640 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.640 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.640 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.640 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.641 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.641 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.641 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.641 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.641 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.641 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.641 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.641 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.642 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.642 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.642 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.642 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.642 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.642 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.643 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.643 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.643 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.643 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.643 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.643 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.643 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.643 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.644 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.644 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.644 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.644 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.644 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.644 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.644 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.644 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.645 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.645 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.645 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.645 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.645 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.645 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.645 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.646 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.646 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.646 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.646 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.646 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.646 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.646 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.647 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.647 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.647 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.647 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.647 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.647 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.647 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.648 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.648 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.648 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.648 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.648 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.648 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.648 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.649 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.649 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.649 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.649 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.649 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.649 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.650 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.650 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.650 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.650 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.650 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.650 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.650 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.651 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.651 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.651 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.651 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.651 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.651 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.651 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.651 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.651 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.652 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.652 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.652 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.652 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.652 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.652 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.652 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.653 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.653 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.653 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.653 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.653 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.653 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.653 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.653 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.654 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.654 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.654 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.654 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.654 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.654 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.654 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.654 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.655 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.655 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.655 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.655 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.655 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.655 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.655 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.655 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.655 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.656 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.656 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.656 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.656 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.656 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.656 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.656 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.656 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.657 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.657 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.657 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.657 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.657 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.657 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.657 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.657 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.657 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.658 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.658 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.658 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.658 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.658 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.658 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.658 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.659 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.659 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.659 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.659 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.659 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.659 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.659 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.659 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.660 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.660 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.660 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.660 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.660 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.660 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.660 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.661 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.661 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.661 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.661 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.661 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.662 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.662 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.662 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.662 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.662 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.663 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.663 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.663 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.663 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.663 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.664 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.664 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.664 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.664 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.664 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.664 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.665 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.665 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.665 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.665 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.665 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.665 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.666 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.666 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.666 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.666 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.666 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.667 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.667 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.667 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.667 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.667 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.667 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.668 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.668 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.668 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.668 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.668 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.669 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.669 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.669 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.669 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.669 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.669 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.670 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.670 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.670 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.670 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.671 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.671 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.671 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.671 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.671 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.672 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.672 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.672 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.672 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.672 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.673 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.673 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.673 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.673 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.673 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.673 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.674 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.674 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.674 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.674 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.674 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.674 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.674 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.675 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.675 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.675 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.675 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.675 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.675 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.676 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.676 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.676 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.676 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.676 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.677 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.677 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.677 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.677 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.677 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.678 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.678 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.678 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.678 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.678 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.679 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.679 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.679 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.679 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.679 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.679 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.680 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.680 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.680 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.680 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.680 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.681 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.681 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.681 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.681 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.681 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.682 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.682 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.682 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.682 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.682 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.682 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.683 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.683 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.683 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.683 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.683 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.684 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.684 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.684 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.684 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.684 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.685 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.685 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.685 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.685 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.685 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.686 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.686 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.686 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.686 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.687 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.687 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.687 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.687 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.687 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.688 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.688 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.688 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.688 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.688 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.689 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.689 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.689 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.689 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.689 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.689 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.690 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.690 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.690 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.690 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.690 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.690 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.691 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.691 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.691 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.691 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.691 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.691 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.691 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.691 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.691 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.692 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.692 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.692 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.692 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.692 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.692 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.692 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.692 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.693 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.693 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.693 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.693 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.693 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.693 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.693 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.693 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.693 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.694 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.694 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.694 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.694 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.694 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.694 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.694 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.694 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.694 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.695 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.695 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.695 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.695 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.695 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.695 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.695 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.695 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.696 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.696 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.696 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.696 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.696 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.696 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.696 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.696 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.697 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.697 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.697 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.697 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.697 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.697 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.697 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.697 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.697 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.698 185296 DEBUG oslo_service.backend._eventlet.service [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Feb 19 19:14:50 compute-0 nova_compute[185292]: 2026-02-19 19:14:50.698 185296 INFO nova.service [-] Starting compute node (version 32.1.0-0.20251105112212.710ffbb.el10)
Feb 19 19:14:50 compute-0 python3.9[186037]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:14:51 compute-0 nova_compute[185292]: 2026-02-19 19:14:51.207 185296 DEBUG nova.virt.libvirt.host [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Feb 19 19:14:51 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Feb 19 19:14:51 compute-0 systemd[1]: Started libvirt QEMU daemon.
Feb 19 19:14:51 compute-0 nova_compute[185292]: 2026-02-19 19:14:51.272 185296 DEBUG nova.virt.libvirt.host [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fab45169220> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Feb 19 19:14:51 compute-0 nova_compute[185292]: libvirt:  error : internal error: could not initialize domain event timer
Feb 19 19:14:51 compute-0 nova_compute[185292]: 2026-02-19 19:14:51.272 185296 WARNING nova.virt.libvirt.host [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Feb 19 19:14:51 compute-0 nova_compute[185292]: 2026-02-19 19:14:51.273 185296 DEBUG nova.virt.libvirt.host [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fab45169220> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Feb 19 19:14:51 compute-0 nova_compute[185292]: 2026-02-19 19:14:51.274 185296 DEBUG nova.virt.libvirt.host [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Feb 19 19:14:51 compute-0 nova_compute[185292]: 2026-02-19 19:14:51.275 185296 DEBUG nova.virt.libvirt.host [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Feb 19 19:14:51 compute-0 nova_compute[185292]: 2026-02-19 19:14:51.275 185296 INFO nova.utils [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] The default thread pool MainProcess.default is initialized
Feb 19 19:14:51 compute-0 nova_compute[185292]: 2026-02-19 19:14:51.276 185296 DEBUG nova.virt.libvirt.host [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Feb 19 19:14:51 compute-0 nova_compute[185292]: 2026-02-19 19:14:51.276 185296 INFO nova.virt.libvirt.driver [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Connection event '1' reason 'None'
Feb 19 19:14:51 compute-0 python3.9[186231]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:14:51 compute-0 nova_compute[185292]: 2026-02-19 19:14:51.783 185296 WARNING nova.virt.libvirt.driver [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Feb 19 19:14:51 compute-0 nova_compute[185292]: 2026-02-19 19:14:51.784 185296 DEBUG nova.virt.libvirt.volume.mount [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 19 19:14:52 compute-0 nova_compute[185292]: 2026-02-19 19:14:52.116 185296 INFO nova.virt.libvirt.host [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Libvirt host capabilities <capabilities>
Feb 19 19:14:52 compute-0 nova_compute[185292]: 
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <host>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <uuid>30a688d1-6db8-4679-8c84-eeb6a5211fb8</uuid>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <cpu>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <arch>x86_64</arch>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model>EPYC-Rome-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <vendor>AMD</vendor>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <microcode version='16777317'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <signature family='23' model='49' stepping='0'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='x2apic'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='tsc-deadline'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='osxsave'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='hypervisor'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='tsc_adjust'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='spec-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='stibp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='arch-capabilities'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='ssbd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='cmp_legacy'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='topoext'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='virt-ssbd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='lbrv'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='tsc-scale'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='vmcb-clean'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='pause-filter'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='pfthreshold'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='svme-addr-chk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='rdctl-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='skip-l1dfl-vmentry'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='mds-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature name='pschange-mc-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <pages unit='KiB' size='4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <pages unit='KiB' size='2048'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <pages unit='KiB' size='1048576'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </cpu>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <power_management>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <suspend_mem/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <suspend_disk/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <suspend_hybrid/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </power_management>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <iommu support='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <migration_features>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <live/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <uri_transports>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <uri_transport>tcp</uri_transport>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <uri_transport>rdma</uri_transport>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </uri_transports>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </migration_features>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <topology>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <cells num='1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <cell id='0'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:           <memory unit='KiB'>7864292</memory>
Feb 19 19:14:52 compute-0 nova_compute[185292]:           <pages unit='KiB' size='4'>1966073</pages>
Feb 19 19:14:52 compute-0 nova_compute[185292]:           <pages unit='KiB' size='2048'>0</pages>
Feb 19 19:14:52 compute-0 nova_compute[185292]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 19 19:14:52 compute-0 nova_compute[185292]:           <distances>
Feb 19 19:14:52 compute-0 nova_compute[185292]:             <sibling id='0' value='10'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:           </distances>
Feb 19 19:14:52 compute-0 nova_compute[185292]:           <cpus num='8'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:           </cpus>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         </cell>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </cells>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </topology>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <cache>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </cache>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <secmodel>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model>selinux</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <doi>0</doi>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </secmodel>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <secmodel>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model>dac</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <doi>0</doi>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </secmodel>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </host>
Feb 19 19:14:52 compute-0 nova_compute[185292]: 
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <guest>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <os_type>hvm</os_type>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <arch name='i686'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <wordsize>32</wordsize>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <domain type='qemu'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <domain type='kvm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </arch>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <features>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <pae/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <nonpae/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <acpi default='on' toggle='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <apic default='on' toggle='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <cpuselection/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <deviceboot/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <disksnapshot default='on' toggle='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <externalSnapshot/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </features>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </guest>
Feb 19 19:14:52 compute-0 nova_compute[185292]: 
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <guest>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <os_type>hvm</os_type>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <arch name='x86_64'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <wordsize>64</wordsize>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <domain type='qemu'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <domain type='kvm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </arch>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <features>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <acpi default='on' toggle='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <apic default='on' toggle='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <cpuselection/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <deviceboot/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <disksnapshot default='on' toggle='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <externalSnapshot/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </features>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </guest>
Feb 19 19:14:52 compute-0 nova_compute[185292]: 
Feb 19 19:14:52 compute-0 nova_compute[185292]: </capabilities>
Feb 19 19:14:52 compute-0 nova_compute[185292]: 
Feb 19 19:14:52 compute-0 nova_compute[185292]: 2026-02-19 19:14:52.127 185296 DEBUG nova.virt.libvirt.host [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Feb 19 19:14:52 compute-0 nova_compute[185292]: 2026-02-19 19:14:52.144 185296 DEBUG nova.virt.libvirt.host [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 19 19:14:52 compute-0 nova_compute[185292]: <domainCapabilities>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <path>/usr/libexec/qemu-kvm</path>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <domain>kvm</domain>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <arch>i686</arch>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <vcpu max='240'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <iothreads supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <os supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <enum name='firmware'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <loader supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='type'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>rom</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>pflash</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='readonly'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>yes</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>no</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='secure'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>no</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </loader>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </os>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <cpu>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <mode name='host-passthrough' supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='hostPassthroughMigratable'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>on</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>off</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </mode>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <mode name='maximum' supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='maximumMigratable'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>on</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>off</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </mode>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <mode name='host-model' supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <vendor>AMD</vendor>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='x2apic'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='tsc-deadline'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='hypervisor'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='tsc_adjust'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='spec-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='stibp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='ssbd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='cmp_legacy'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='overflow-recov'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='succor'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='amd-ssbd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='virt-ssbd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='lbrv'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='tsc-scale'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='vmcb-clean'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='flushbyasid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='pause-filter'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='pfthreshold'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='svme-addr-chk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='disable' name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </mode>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <mode name='custom' supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-noTSX'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-v5'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='ClearwaterForest'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ddpd-u'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='intel-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='lam'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sha512'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sm3'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sm4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='ClearwaterForest-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ddpd-u'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='intel-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='lam'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sha512'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sm3'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sm4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cooperlake'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cooperlake-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cooperlake-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Denverton'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mpx'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Denverton-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mpx'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Denverton-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Denverton-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Dhyana-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Genoa'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='auto-ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Genoa-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='auto-ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Genoa-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='auto-ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='perfmon-v2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Milan'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Milan-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Milan-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Milan-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Rome'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Rome-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Rome-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Rome-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Turin'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='auto-ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vp2intersect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibpb-brtype'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='perfmon-v2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbpb'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='srso-user-kernel-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Turin-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='auto-ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vp2intersect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibpb-brtype'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='perfmon-v2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbpb'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='srso-user-kernel-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-v5'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='GraniteRapids'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='GraniteRapids-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='GraniteRapids-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-128'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-256'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-512'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='GraniteRapids-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-128'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-256'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-512'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-noTSX'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-noTSX'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v5'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v6'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v7'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='IvyBridge'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='IvyBridge-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='IvyBridge-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='IvyBridge-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='KnightsMill'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-4fmaps'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-4vnniw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512er'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512pf'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='KnightsMill-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-4fmaps'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-4vnniw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512er'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512pf'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Opteron_G4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fma4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xop'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Opteron_G4-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fma4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xop'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Opteron_G5'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fma4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tbm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xop'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Opteron_G5-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fma4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tbm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xop'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SapphireRapids'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SapphireRapids-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SapphireRapids-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SapphireRapids-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SapphireRapids-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SierraForest'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SierraForest-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SierraForest-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='intel-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='lam'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SierraForest-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='intel-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='lam'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-v5'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Snowridge'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='core-capability'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mpx'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='split-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Snowridge-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='core-capability'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mpx'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='split-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Snowridge-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='core-capability'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='split-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Snowridge-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='core-capability'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='split-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Snowridge-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='athlon'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnow'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnowext'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='athlon-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnow'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnowext'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='core2duo'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='core2duo-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='coreduo'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='coreduo-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='n270'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='n270-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='phenom'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnow'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnowext'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='phenom-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnow'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnowext'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </mode>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </cpu>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <memoryBacking supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <enum name='sourceType'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <value>file</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <value>anonymous</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <value>memfd</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </memoryBacking>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <devices>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <disk supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='diskDevice'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>disk</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>cdrom</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>floppy</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>lun</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='bus'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>ide</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>fdc</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>scsi</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>usb</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>sata</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='model'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio-transitional</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio-non-transitional</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </disk>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <graphics supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='type'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vnc</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>egl-headless</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>dbus</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </graphics>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <video supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='modelType'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vga</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>cirrus</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>none</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>bochs</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>ramfb</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </video>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <hostdev supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='mode'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>subsystem</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='startupPolicy'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>default</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>mandatory</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>requisite</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>optional</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='subsysType'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>usb</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>pci</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>scsi</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='capsType'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='pciBackend'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </hostdev>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <rng supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='model'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio-transitional</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio-non-transitional</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='backendModel'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>random</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>egd</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>builtin</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </rng>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <filesystem supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='driverType'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>path</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>handle</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtiofs</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </filesystem>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <tpm supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='model'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>tpm-tis</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>tpm-crb</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='backendModel'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>emulator</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>external</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='backendVersion'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>2.0</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </tpm>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <redirdev supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='bus'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>usb</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </redirdev>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <channel supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='type'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>pty</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>unix</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </channel>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <crypto supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='model'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='type'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>qemu</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='backendModel'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>builtin</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </crypto>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <interface supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='backendType'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>default</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>passt</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </interface>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <panic supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='model'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>isa</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>hyperv</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </panic>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <console supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='type'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>null</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vc</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>pty</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>dev</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>file</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>pipe</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>stdio</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>udp</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>tcp</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>unix</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>qemu-vdagent</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>dbus</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </console>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </devices>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <features>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <gic supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <vmcoreinfo supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <genid supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <backingStoreInput supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <backup supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <async-teardown supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <s390-pv supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <ps2 supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <tdx supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <sev supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <sgx supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <hyperv supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='features'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>relaxed</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vapic</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>spinlocks</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vpindex</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>runtime</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>synic</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>stimer</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>reset</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vendor_id</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>frequencies</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>reenlightenment</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>tlbflush</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>ipi</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>avic</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>emsr_bitmap</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>xmm_input</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <defaults>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <spinlocks>4095</spinlocks>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <stimer_direct>on</stimer_direct>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <tlbflush_direct>on</tlbflush_direct>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <tlbflush_extended>on</tlbflush_extended>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </defaults>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </hyperv>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <launchSecurity supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </features>
Feb 19 19:14:52 compute-0 nova_compute[185292]: </domainCapabilities>
Feb 19 19:14:52 compute-0 nova_compute[185292]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Feb 19 19:14:52 compute-0 nova_compute[185292]: 2026-02-19 19:14:52.153 185296 DEBUG nova.virt.libvirt.host [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 19 19:14:52 compute-0 nova_compute[185292]: <domainCapabilities>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <path>/usr/libexec/qemu-kvm</path>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <domain>kvm</domain>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <arch>i686</arch>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <vcpu max='4096'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <iothreads supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <os supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <enum name='firmware'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <loader supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='type'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>rom</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>pflash</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='readonly'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>yes</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>no</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='secure'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>no</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </loader>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </os>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <cpu>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <mode name='host-passthrough' supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='hostPassthroughMigratable'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>on</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>off</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </mode>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <mode name='maximum' supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='maximumMigratable'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>on</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>off</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </mode>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <mode name='host-model' supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <vendor>AMD</vendor>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='x2apic'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='tsc-deadline'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='hypervisor'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='tsc_adjust'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='spec-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='stibp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='ssbd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='cmp_legacy'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='overflow-recov'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='succor'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='amd-ssbd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='virt-ssbd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='lbrv'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='tsc-scale'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='vmcb-clean'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='flushbyasid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='pause-filter'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='pfthreshold'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='svme-addr-chk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='disable' name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </mode>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <mode name='custom' supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-noTSX'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-v5'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='ClearwaterForest'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ddpd-u'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='intel-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='lam'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sha512'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sm3'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sm4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='ClearwaterForest-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ddpd-u'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='intel-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='lam'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sha512'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sm3'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sm4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cooperlake'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cooperlake-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cooperlake-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Denverton'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mpx'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Denverton-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mpx'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Denverton-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Denverton-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Dhyana-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Genoa'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='auto-ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Genoa-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='auto-ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Genoa-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='auto-ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='perfmon-v2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Milan'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Milan-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Milan-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Milan-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Rome'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Rome-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Rome-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Rome-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Turin'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='auto-ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vp2intersect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibpb-brtype'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='perfmon-v2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbpb'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='srso-user-kernel-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Turin-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='auto-ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vp2intersect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibpb-brtype'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='perfmon-v2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbpb'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='srso-user-kernel-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-v5'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='GraniteRapids'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='GraniteRapids-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='GraniteRapids-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-128'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-256'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-512'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='GraniteRapids-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-128'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-256'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-512'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-noTSX'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-noTSX'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v5'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v6'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v7'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='IvyBridge'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='IvyBridge-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='IvyBridge-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='IvyBridge-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='KnightsMill'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-4fmaps'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-4vnniw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512er'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 sudo[186401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkfzrrjmwcxhplrsgfpxhgjllvggyoud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528491.7971702-2676-47444655180677/AnsiballZ_podman_container.py'
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512pf'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='KnightsMill-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-4fmaps'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-4vnniw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512er'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512pf'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Opteron_G4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fma4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xop'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Opteron_G4-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fma4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xop'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Opteron_G5'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fma4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tbm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xop'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Opteron_G5-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fma4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tbm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xop'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SapphireRapids'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SapphireRapids-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 sudo[186401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SapphireRapids-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SapphireRapids-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SapphireRapids-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SierraForest'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SierraForest-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SierraForest-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='intel-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='lam'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SierraForest-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='intel-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='lam'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-v5'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Snowridge'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='core-capability'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mpx'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='split-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Snowridge-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='core-capability'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mpx'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='split-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Snowridge-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='core-capability'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='split-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Snowridge-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='core-capability'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='split-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Snowridge-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='athlon'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnow'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnowext'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='athlon-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnow'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnowext'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='core2duo'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='core2duo-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='coreduo'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='coreduo-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='n270'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='n270-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='phenom'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnow'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnowext'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='phenom-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnow'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnowext'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </mode>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </cpu>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <memoryBacking supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <enum name='sourceType'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <value>file</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <value>anonymous</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <value>memfd</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </memoryBacking>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <devices>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <disk supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='diskDevice'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>disk</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>cdrom</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>floppy</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>lun</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='bus'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>fdc</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>scsi</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>usb</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>sata</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='model'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio-transitional</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio-non-transitional</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </disk>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <graphics supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='type'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vnc</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>egl-headless</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>dbus</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </graphics>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <video supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='modelType'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vga</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>cirrus</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>none</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>bochs</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>ramfb</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </video>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <hostdev supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='mode'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>subsystem</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='startupPolicy'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>default</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>mandatory</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>requisite</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>optional</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='subsysType'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>usb</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>pci</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>scsi</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='capsType'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='pciBackend'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </hostdev>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <rng supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='model'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio-transitional</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio-non-transitional</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='backendModel'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>random</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>egd</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>builtin</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </rng>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <filesystem supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='driverType'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>path</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>handle</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtiofs</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </filesystem>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <tpm supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='model'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>tpm-tis</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>tpm-crb</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='backendModel'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>emulator</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>external</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='backendVersion'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>2.0</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </tpm>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <redirdev supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='bus'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>usb</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </redirdev>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <channel supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='type'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>pty</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>unix</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </channel>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <crypto supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='model'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='type'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>qemu</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='backendModel'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>builtin</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </crypto>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <interface supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='backendType'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>default</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>passt</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </interface>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <panic supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='model'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>isa</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>hyperv</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </panic>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <console supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='type'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>null</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vc</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>pty</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>dev</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>file</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>pipe</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>stdio</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>udp</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>tcp</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>unix</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>qemu-vdagent</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>dbus</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </console>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </devices>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <features>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <gic supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <vmcoreinfo supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <genid supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <backingStoreInput supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <backup supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <async-teardown supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <s390-pv supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <ps2 supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <tdx supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <sev supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <sgx supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <hyperv supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='features'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>relaxed</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vapic</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>spinlocks</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vpindex</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>runtime</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>synic</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>stimer</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>reset</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vendor_id</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>frequencies</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>reenlightenment</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>tlbflush</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>ipi</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>avic</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>emsr_bitmap</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>xmm_input</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <defaults>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <spinlocks>4095</spinlocks>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <stimer_direct>on</stimer_direct>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <tlbflush_direct>on</tlbflush_direct>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <tlbflush_extended>on</tlbflush_extended>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </defaults>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </hyperv>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <launchSecurity supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </features>
Feb 19 19:14:52 compute-0 nova_compute[185292]: </domainCapabilities>
Feb 19 19:14:52 compute-0 nova_compute[185292]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Feb 19 19:14:52 compute-0 nova_compute[185292]: 2026-02-19 19:14:52.231 185296 DEBUG nova.virt.libvirt.host [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Feb 19 19:14:52 compute-0 nova_compute[185292]: 2026-02-19 19:14:52.235 185296 DEBUG nova.virt.libvirt.host [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 19 19:14:52 compute-0 nova_compute[185292]: <domainCapabilities>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <path>/usr/libexec/qemu-kvm</path>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <domain>kvm</domain>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <arch>x86_64</arch>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <vcpu max='240'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <iothreads supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <os supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <enum name='firmware'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <loader supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='type'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>rom</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>pflash</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='readonly'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>yes</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>no</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='secure'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>no</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </loader>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </os>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <cpu>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <mode name='host-passthrough' supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='hostPassthroughMigratable'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>on</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>off</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </mode>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <mode name='maximum' supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='maximumMigratable'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>on</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>off</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </mode>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <mode name='host-model' supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <vendor>AMD</vendor>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='x2apic'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='tsc-deadline'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='hypervisor'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='tsc_adjust'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='spec-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='stibp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='ssbd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='cmp_legacy'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='overflow-recov'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='succor'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='amd-ssbd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='virt-ssbd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='lbrv'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='tsc-scale'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='vmcb-clean'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='flushbyasid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='pause-filter'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='pfthreshold'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='svme-addr-chk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='disable' name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </mode>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <mode name='custom' supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-noTSX'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-v5'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='ClearwaterForest'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ddpd-u'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='intel-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='lam'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sha512'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sm3'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sm4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='ClearwaterForest-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ddpd-u'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='intel-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='lam'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sha512'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sm3'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sm4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cooperlake'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cooperlake-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cooperlake-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Denverton'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mpx'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Denverton-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mpx'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Denverton-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Denverton-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Dhyana-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Genoa'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='auto-ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Genoa-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='auto-ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Genoa-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='auto-ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='perfmon-v2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Milan'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Milan-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Milan-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Milan-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Rome'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Rome-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Rome-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Rome-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Turin'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='auto-ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vp2intersect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibpb-brtype'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='perfmon-v2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbpb'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='srso-user-kernel-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Turin-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='auto-ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vp2intersect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibpb-brtype'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='perfmon-v2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbpb'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='srso-user-kernel-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-v5'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='GraniteRapids'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='GraniteRapids-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='GraniteRapids-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-128'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-256'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-512'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='GraniteRapids-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-128'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-256'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-512'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-noTSX'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-noTSX'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v5'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v6'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v7'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='IvyBridge'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='IvyBridge-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='IvyBridge-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='IvyBridge-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='KnightsMill'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-4fmaps'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-4vnniw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512er'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512pf'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='KnightsMill-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-4fmaps'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-4vnniw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512er'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512pf'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Opteron_G4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fma4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xop'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Opteron_G4-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fma4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xop'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Opteron_G5'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fma4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tbm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xop'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Opteron_G5-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fma4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tbm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xop'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SapphireRapids'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SapphireRapids-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SapphireRapids-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SapphireRapids-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SapphireRapids-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SierraForest'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SierraForest-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SierraForest-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='intel-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='lam'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SierraForest-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='intel-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='lam'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-v5'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Snowridge'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='core-capability'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mpx'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='split-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Snowridge-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='core-capability'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mpx'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='split-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Snowridge-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='core-capability'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='split-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Snowridge-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='core-capability'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='split-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Snowridge-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='athlon'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnow'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnowext'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='athlon-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnow'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnowext'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='core2duo'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='core2duo-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='coreduo'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='coreduo-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='n270'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='n270-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='phenom'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnow'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnowext'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='phenom-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnow'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnowext'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </mode>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </cpu>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <memoryBacking supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <enum name='sourceType'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <value>file</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <value>anonymous</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <value>memfd</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </memoryBacking>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <devices>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <disk supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='diskDevice'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>disk</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>cdrom</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>floppy</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>lun</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='bus'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>ide</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>fdc</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>scsi</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>usb</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>sata</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='model'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio-transitional</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio-non-transitional</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </disk>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <graphics supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='type'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vnc</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>egl-headless</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>dbus</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </graphics>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <video supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='modelType'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vga</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>cirrus</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>none</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>bochs</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>ramfb</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </video>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <hostdev supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='mode'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>subsystem</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='startupPolicy'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>default</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>mandatory</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>requisite</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>optional</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='subsysType'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>usb</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>pci</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>scsi</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='capsType'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='pciBackend'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </hostdev>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <rng supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='model'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio-transitional</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio-non-transitional</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='backendModel'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>random</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>egd</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>builtin</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </rng>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <filesystem supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='driverType'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>path</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>handle</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtiofs</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </filesystem>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <tpm supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='model'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>tpm-tis</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>tpm-crb</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='backendModel'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>emulator</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>external</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='backendVersion'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>2.0</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </tpm>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <redirdev supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='bus'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>usb</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </redirdev>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <channel supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='type'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>pty</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>unix</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </channel>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <crypto supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='model'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='type'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>qemu</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='backendModel'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>builtin</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </crypto>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <interface supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='backendType'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>default</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>passt</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </interface>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <panic supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='model'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>isa</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>hyperv</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </panic>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <console supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='type'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>null</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vc</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>pty</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>dev</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>file</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>pipe</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>stdio</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>udp</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>tcp</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>unix</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>qemu-vdagent</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>dbus</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </console>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </devices>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <features>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <gic supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <vmcoreinfo supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <genid supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <backingStoreInput supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <backup supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <async-teardown supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <s390-pv supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <ps2 supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <tdx supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <sev supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <sgx supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <hyperv supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='features'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>relaxed</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vapic</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>spinlocks</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vpindex</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>runtime</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>synic</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>stimer</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>reset</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vendor_id</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>frequencies</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>reenlightenment</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>tlbflush</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>ipi</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>avic</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>emsr_bitmap</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>xmm_input</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <defaults>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <spinlocks>4095</spinlocks>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <stimer_direct>on</stimer_direct>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <tlbflush_direct>on</tlbflush_direct>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <tlbflush_extended>on</tlbflush_extended>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </defaults>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </hyperv>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <launchSecurity supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </features>
Feb 19 19:14:52 compute-0 nova_compute[185292]: </domainCapabilities>
Feb 19 19:14:52 compute-0 nova_compute[185292]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Feb 19 19:14:52 compute-0 nova_compute[185292]: 2026-02-19 19:14:52.296 185296 DEBUG nova.virt.libvirt.host [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 19 19:14:52 compute-0 nova_compute[185292]: <domainCapabilities>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <path>/usr/libexec/qemu-kvm</path>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <domain>kvm</domain>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <arch>x86_64</arch>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <vcpu max='4096'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <iothreads supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <os supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <enum name='firmware'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <value>efi</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <loader supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='type'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>rom</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>pflash</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='readonly'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>yes</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>no</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='secure'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>yes</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>no</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </loader>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </os>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <cpu>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <mode name='host-passthrough' supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='hostPassthroughMigratable'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>on</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>off</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </mode>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <mode name='maximum' supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='maximumMigratable'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>on</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>off</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </mode>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <mode name='host-model' supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <vendor>AMD</vendor>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='x2apic'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='tsc-deadline'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='hypervisor'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='tsc_adjust'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='spec-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='stibp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='ssbd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='cmp_legacy'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='overflow-recov'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='succor'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='amd-ssbd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='virt-ssbd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='lbrv'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='tsc-scale'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='vmcb-clean'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='flushbyasid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='pause-filter'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='pfthreshold'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='svme-addr-chk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <feature policy='disable' name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </mode>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <mode name='custom' supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-noTSX'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Broadwell-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cascadelake-Server-v5'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='ClearwaterForest'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ddpd-u'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='intel-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='lam'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sha512'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sm3'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sm4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='ClearwaterForest-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ddpd-u'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='intel-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='lam'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sha512'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sm3'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sm4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cooperlake'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cooperlake-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Cooperlake-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Denverton'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mpx'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Denverton-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mpx'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Denverton-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Denverton-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Dhyana-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Genoa'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='auto-ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Genoa-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='auto-ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Genoa-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='auto-ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='perfmon-v2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Milan'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Milan-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Milan-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Milan-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Rome'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Rome-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Rome-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Rome-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Turin'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='auto-ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vp2intersect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibpb-brtype'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='perfmon-v2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbpb'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='srso-user-kernel-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-Turin-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amd-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='auto-ibrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vp2intersect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibpb-brtype'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='perfmon-v2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbpb'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='srso-user-kernel-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='stibp-always-on'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='EPYC-v5'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='GraniteRapids'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='GraniteRapids-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='GraniteRapids-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-128'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-256'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-512'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='GraniteRapids-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-128'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-256'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx10-512'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='prefetchiti'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-noTSX'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Haswell-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-noTSX'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v5'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v6'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Icelake-Server-v7'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='IvyBridge'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='IvyBridge-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='IvyBridge-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='IvyBridge-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='KnightsMill'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-4fmaps'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-4vnniw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512er'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512pf'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='KnightsMill-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-4fmaps'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-4vnniw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512er'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512pf'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Opteron_G4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fma4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xop'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Opteron_G4-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fma4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xop'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Opteron_G5'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fma4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tbm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xop'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Opteron_G5-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fma4'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tbm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xop'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SapphireRapids'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 systemd[1]: Started libvirt nodedev daemon.
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SapphireRapids-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SapphireRapids-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SapphireRapids-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SapphireRapids-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='amx-tile'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-bf16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-fp16'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bitalg'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrc'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fzrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='la57'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='taa-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SierraForest'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SierraForest-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SierraForest-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='intel-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='lam'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='SierraForest-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ifma'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cmpccxadd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fbsdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='fsrs'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ibrs-all'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='intel-psfd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='lam'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mcdt-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pbrsb-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='psdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='serialize'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vaes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Client-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='hle'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='rtm'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Skylake-Server-v5'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512bw'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512cd'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512dq'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512f'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='avx512vl'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='invpcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pcid'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='pku'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Snowridge'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='core-capability'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mpx'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='split-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Snowridge-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='core-capability'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='mpx'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='split-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Snowridge-v2'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='core-capability'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='split-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Snowridge-v3'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='core-capability'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='split-lock-detect'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='Snowridge-v4'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='cldemote'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='erms'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='gfni'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdir64b'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='movdiri'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='xsaves'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='athlon'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnow'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnowext'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='athlon-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnow'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnowext'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='core2duo'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='core2duo-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='coreduo'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='coreduo-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='n270'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='n270-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='ss'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='phenom'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnow'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnowext'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <blockers model='phenom-v1'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnow'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <feature name='3dnowext'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </blockers>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </mode>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </cpu>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <memoryBacking supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <enum name='sourceType'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <value>file</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <value>anonymous</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <value>memfd</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </memoryBacking>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <devices>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <disk supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='diskDevice'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>disk</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>cdrom</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>floppy</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>lun</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='bus'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>fdc</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>scsi</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>usb</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>sata</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='model'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio-transitional</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio-non-transitional</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </disk>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <graphics supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='type'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vnc</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>egl-headless</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>dbus</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </graphics>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <video supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='modelType'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vga</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>cirrus</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>none</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>bochs</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>ramfb</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </video>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <hostdev supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='mode'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>subsystem</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='startupPolicy'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>default</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>mandatory</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>requisite</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>optional</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='subsysType'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>usb</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>pci</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>scsi</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='capsType'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='pciBackend'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </hostdev>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <rng supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='model'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio-transitional</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtio-non-transitional</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='backendModel'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>random</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>egd</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>builtin</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </rng>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <filesystem supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='driverType'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>path</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>handle</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>virtiofs</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </filesystem>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <tpm supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='model'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>tpm-tis</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>tpm-crb</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='backendModel'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>emulator</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>external</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='backendVersion'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>2.0</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </tpm>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <redirdev supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='bus'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>usb</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </redirdev>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <channel supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='type'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>pty</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>unix</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </channel>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <crypto supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='model'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='type'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>qemu</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='backendModel'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>builtin</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </crypto>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <interface supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='backendType'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>default</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>passt</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </interface>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <panic supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='model'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>isa</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>hyperv</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </panic>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <console supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='type'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>null</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vc</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>pty</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>dev</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>file</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>pipe</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>stdio</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>udp</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>tcp</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>unix</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>qemu-vdagent</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>dbus</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </console>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </devices>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <features>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <gic supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <vmcoreinfo supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <genid supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <backingStoreInput supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <backup supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <async-teardown supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <s390-pv supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <ps2 supported='yes'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <tdx supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <sev supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <sgx supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <hyperv supported='yes'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <enum name='features'>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>relaxed</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vapic</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>spinlocks</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vpindex</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>runtime</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>synic</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>stimer</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>reset</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>vendor_id</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>frequencies</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>reenlightenment</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>tlbflush</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>ipi</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>avic</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>emsr_bitmap</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <value>xmm_input</value>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </enum>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       <defaults>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <spinlocks>4095</spinlocks>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <stimer_direct>on</stimer_direct>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <tlbflush_direct>on</tlbflush_direct>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <tlbflush_extended>on</tlbflush_extended>
Feb 19 19:14:52 compute-0 nova_compute[185292]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 19 19:14:52 compute-0 nova_compute[185292]:       </defaults>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     </hyperv>
Feb 19 19:14:52 compute-0 nova_compute[185292]:     <launchSecurity supported='no'/>
Feb 19 19:14:52 compute-0 nova_compute[185292]:   </features>
Feb 19 19:14:52 compute-0 nova_compute[185292]: </domainCapabilities>
Feb 19 19:14:52 compute-0 nova_compute[185292]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
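The `<domainCapabilities>` document dumped above is what nova's `_get_domain_capabilities` inspects to learn which device models the hypervisor offers. A minimal sketch of pulling the per-device enums out of such XML with the standard library; the `DOMCAPS_XML` sample below is a trimmed, illustrative stand-in for the full output logged above, not the real capabilities of this host:

```python
# Sketch: extract device-support enums from a libvirt domainCapabilities
# document (as returned by `virsh domcapabilities` or
# virConnect.getDomainCapabilities()). Sample XML is trimmed for illustration.
import xml.etree.ElementTree as ET

DOMCAPS_XML = """
<domainCapabilities>
  <devices>
    <disk supported='yes'>
      <enum name='bus'>
        <value>scsi</value>
        <value>virtio</value>
        <value>sata</value>
      </enum>
    </disk>
    <tpm supported='yes'>
      <enum name='model'>
        <value>tpm-tis</value>
        <value>tpm-crb</value>
      </enum>
    </tpm>
  </devices>
</domainCapabilities>
"""

def device_enums(xml_text):
    """Return {device_tag: {enum_name: [values]}} for devices marked supported='yes'."""
    root = ET.fromstring(xml_text)
    out = {}
    for dev in root.find('devices'):
        if dev.get('supported') != 'yes':
            continue
        out[dev.tag] = {
            enum.get('name'): [v.text for v in enum.findall('value')]
            for enum in dev.findall('enum')
        }
    return out

caps = device_enums(DOMCAPS_XML)
print(caps['disk']['bus'])    # disk buses the hypervisor advertises
print(caps['tpm']['model'])   # available TPM device models
```

This is the same structure nova consults when it later logs "Enabling emulated TPM support": the `<tpm supported='yes'>` element with an `emulator` backend is what makes vTPM viable on this host.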
Feb 19 19:14:52 compute-0 nova_compute[185292]: 2026-02-19 19:14:52.356 185296 DEBUG nova.virt.libvirt.host [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Feb 19 19:14:52 compute-0 nova_compute[185292]: 2026-02-19 19:14:52.356 185296 DEBUG nova.virt.libvirt.host [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Feb 19 19:14:52 compute-0 nova_compute[185292]: 2026-02-19 19:14:52.357 185296 DEBUG nova.virt.libvirt.host [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Feb 19 19:14:52 compute-0 nova_compute[185292]: 2026-02-19 19:14:52.364 185296 INFO nova.virt.libvirt.host [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Secure Boot support detected
Feb 19 19:14:52 compute-0 nova_compute[185292]: 2026-02-19 19:14:52.368 185296 INFO nova.virt.libvirt.driver [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 19 19:14:52 compute-0 nova_compute[185292]: 2026-02-19 19:14:52.369 185296 INFO nova.virt.libvirt.driver [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 19 19:14:52 compute-0 nova_compute[185292]: 2026-02-19 19:14:52.459 185296 DEBUG nova.virt.libvirt.driver [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] cpu compare xml: <cpu match="exact">
Feb 19 19:14:52 compute-0 nova_compute[185292]:   <model>Nehalem</model>
Feb 19 19:14:52 compute-0 nova_compute[185292]: </cpu>
Feb 19 19:14:52 compute-0 nova_compute[185292]:  _compare_cpu /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10922
Feb 19 19:14:52 compute-0 nova_compute[185292]: 2026-02-19 19:14:52.461 185296 DEBUG nova.virt.libvirt.driver [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1177
Feb 19 19:14:52 compute-0 python3.9[186404]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 19 19:14:52 compute-0 sudo[186401]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:52 compute-0 rsyslogd[1019]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 19 19:14:52 compute-0 nova_compute[185292]: 2026-02-19 19:14:52.971 185296 INFO nova.virt.node [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Determined node identity 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 from /var/lib/nova/compute_id
Feb 19 19:14:53 compute-0 nova_compute[185292]: 2026-02-19 19:14:53.479 185296 WARNING nova.compute.manager [None req-db812451-90b4-43ba-9f92-00a85483fbe3 - - - - - -] Compute nodes ['11ecaf50-b8a2-48b5-a41c-a8b0b10798d6'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Feb 19 19:14:54 compute-0 sudo[186599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsggtthcjglkkcorzoibmrtijxpacgtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528493.823302-2692-33565738243591/AnsiballZ_systemd.py'
Feb 19 19:14:54 compute-0 sudo[186599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:54 compute-0 python3.9[186602]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 19 19:14:54 compute-0 systemd[1]: Stopping nova_compute container...
Feb 19 19:14:54 compute-0 nova_compute[185292]: 2026-02-19 19:14:54.401 185296 DEBUG oslo_concurrency.lockutils [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:14:54 compute-0 nova_compute[185292]: 2026-02-19 19:14:54.401 185296 DEBUG oslo_concurrency.lockutils [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:14:54 compute-0 nova_compute[185292]: 2026-02-19 19:14:54.401 185296 DEBUG oslo_concurrency.lockutils [None req-a5efac37-7a46-4212-9f09-7d4e314199d4 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:14:54 compute-0 virtqemud[186157]: libvirt version: 11.10.0, package: 4.el9 (builder@centos.org, 2026-01-29-15:25:17, )
Feb 19 19:14:54 compute-0 virtqemud[186157]: hostname: compute-0
Feb 19 19:14:54 compute-0 virtqemud[186157]: End of file while reading data: Input/output error
Feb 19 19:14:54 compute-0 systemd[1]: libpod-faf00e9823aab32743a3c5e8285c40c46929d8e72d00f7dbe49efb141f3bd600.scope: Deactivated successfully.
Feb 19 19:14:54 compute-0 systemd[1]: libpod-faf00e9823aab32743a3c5e8285c40c46929d8e72d00f7dbe49efb141f3bd600.scope: Consumed 2.772s CPU time.
Feb 19 19:14:54 compute-0 podman[186606]: 2026-02-19 19:14:54.840459639 +0000 UTC m=+0.467259721 container died faf00e9823aab32743a3c5e8285c40c46929d8e72d00f7dbe49efb141f3bd600 (image=38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-5341527c8dd34d6abf221d866e12a549ca4c7217b5db19aa77437583172ec720'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=nova_compute, container_name=nova_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Feb 19 19:14:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-faf00e9823aab32743a3c5e8285c40c46929d8e72d00f7dbe49efb141f3bd600-userdata-shm.mount: Deactivated successfully.
Feb 19 19:14:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-aff0712aec6d7b9f713f644b07baef62f58430fe3526737610d4561e67a69608-merged.mount: Deactivated successfully.
Feb 19 19:14:54 compute-0 podman[186606]: 2026-02-19 19:14:54.884315645 +0000 UTC m=+0.511115697 container cleanup faf00e9823aab32743a3c5e8285c40c46929d8e72d00f7dbe49efb141f3bd600 (image=38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-5341527c8dd34d6abf221d866e12a549ca4c7217b5db19aa77437583172ec720'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=nova_compute, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:14:54 compute-0 podman[186606]: nova_compute
Feb 19 19:14:54 compute-0 podman[186634]: nova_compute
Feb 19 19:14:54 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Feb 19 19:14:54 compute-0 systemd[1]: Stopped nova_compute container.
Feb 19 19:14:54 compute-0 systemd[1]: Starting nova_compute container...
Feb 19 19:14:55 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:14:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aff0712aec6d7b9f713f644b07baef62f58430fe3526737610d4561e67a69608/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 19 19:14:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aff0712aec6d7b9f713f644b07baef62f58430fe3526737610d4561e67a69608/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 19 19:14:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aff0712aec6d7b9f713f644b07baef62f58430fe3526737610d4561e67a69608/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 19 19:14:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aff0712aec6d7b9f713f644b07baef62f58430fe3526737610d4561e67a69608/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 19 19:14:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aff0712aec6d7b9f713f644b07baef62f58430fe3526737610d4561e67a69608/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 19 19:14:55 compute-0 podman[186647]: 2026-02-19 19:14:55.080500811 +0000 UTC m=+0.104545477 container init faf00e9823aab32743a3c5e8285c40c46929d8e72d00f7dbe49efb141f3bd600 (image=38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-5341527c8dd34d6abf221d866e12a549ca4c7217b5db19aa77437583172ec720'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, config_id=nova_compute, org.label-schema.build-date=20260216)
Feb 19 19:14:55 compute-0 podman[186647]: 2026-02-19 19:14:55.091279875 +0000 UTC m=+0.115324451 container start faf00e9823aab32743a3c5e8285c40c46929d8e72d00f7dbe49efb141f3bd600 (image=38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.build-date=20260216, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-5341527c8dd34d6abf221d866e12a549ca4c7217b5db19aa77437583172ec720'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute)
Feb 19 19:14:55 compute-0 podman[186647]: nova_compute
Feb 19 19:14:55 compute-0 systemd[1]: Started nova_compute container.
Feb 19 19:14:55 compute-0 nova_compute[186662]: + sudo -E kolla_set_configs
Feb 19 19:14:55 compute-0 sudo[186599]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Validating config file
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Copying service configuration files
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Copying /var/lib/kolla/config_files/src/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Deleting /etc/ceph
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Creating directory /etc/ceph
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Setting permission for /etc/ceph
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Writing out command to execute
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 19 19:14:55 compute-0 nova_compute[186662]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 19 19:14:55 compute-0 nova_compute[186662]: ++ cat /run_command
Feb 19 19:14:55 compute-0 nova_compute[186662]: + CMD=nova-compute
Feb 19 19:14:55 compute-0 nova_compute[186662]: + ARGS=
Feb 19 19:14:55 compute-0 nova_compute[186662]: + sudo kolla_copy_cacerts
Feb 19 19:14:55 compute-0 nova_compute[186662]: + [[ ! -n '' ]]
Feb 19 19:14:55 compute-0 nova_compute[186662]: + . kolla_extend_start
Feb 19 19:14:55 compute-0 nova_compute[186662]: + echo 'Running command: '\''nova-compute'\'''
Feb 19 19:14:55 compute-0 nova_compute[186662]: Running command: 'nova-compute'
Feb 19 19:14:55 compute-0 nova_compute[186662]: + umask 0022
Feb 19 19:14:55 compute-0 nova_compute[186662]: + exec nova-compute
Feb 19 19:14:55 compute-0 sudo[186823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjokxwcdlvkdclwutjoujflvswyvsneo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528495.3118062-2710-124563820759721/AnsiballZ_podman_container.py'
Feb 19 19:14:55 compute-0 sudo[186823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:14:55 compute-0 python3.9[186826]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 19 19:14:56 compute-0 systemd[1]: Started libpod-conmon-6f7f6b29aa5c747c376dd46dc00fbc27e3bc82fd22a26f9c73cffe17e8c7ac8c.scope.
Feb 19 19:14:56 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:14:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11060ae14498fc741c28d209fe9da492ee37f1c5bacd74cb87310ac3223f4ef4/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Feb 19 19:14:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11060ae14498fc741c28d209fe9da492ee37f1c5bacd74cb87310ac3223f4ef4/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Feb 19 19:14:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11060ae14498fc741c28d209fe9da492ee37f1c5bacd74cb87310ac3223f4ef4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 19 19:14:56 compute-0 podman[186851]: 2026-02-19 19:14:56.348930221 +0000 UTC m=+0.139557094 container init 6f7f6b29aa5c747c376dd46dc00fbc27e3bc82fd22a26f9c73cffe17e8c7ac8c (image=38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '5341527c8dd34d6abf221d866e12a549ca4c7217b5db19aa77437583172ec720'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=nova_compute_init, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Feb 19 19:14:56 compute-0 podman[186851]: 2026-02-19 19:14:56.355308749 +0000 UTC m=+0.145935622 container start 6f7f6b29aa5c747c376dd46dc00fbc27e3bc82fd22a26f9c73cffe17e8c7ac8c (image=38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '5341527c8dd34d6abf221d866e12a549ca4c7217b5db19aa77437583172ec720'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=nova_compute_init, container_name=nova_compute_init)
Feb 19 19:14:56 compute-0 python3.9[186826]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Feb 19 19:14:56 compute-0 nova_compute_init[186873]: INFO:nova_statedir:Applying nova statedir ownership
Feb 19 19:14:56 compute-0 nova_compute_init[186873]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Feb 19 19:14:56 compute-0 nova_compute_init[186873]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Feb 19 19:14:56 compute-0 nova_compute_init[186873]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Feb 19 19:14:56 compute-0 nova_compute_init[186873]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Feb 19 19:14:56 compute-0 nova_compute_init[186873]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Feb 19 19:14:56 compute-0 nova_compute_init[186873]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Feb 19 19:14:56 compute-0 nova_compute_init[186873]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Feb 19 19:14:56 compute-0 nova_compute_init[186873]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Feb 19 19:14:56 compute-0 nova_compute_init[186873]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Feb 19 19:14:56 compute-0 nova_compute_init[186873]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Feb 19 19:14:56 compute-0 nova_compute_init[186873]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Feb 19 19:14:56 compute-0 nova_compute_init[186873]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Feb 19 19:14:56 compute-0 nova_compute_init[186873]: INFO:nova_statedir:Nova statedir ownership complete
Feb 19 19:14:56 compute-0 systemd[1]: libpod-6f7f6b29aa5c747c376dd46dc00fbc27e3bc82fd22a26f9c73cffe17e8c7ac8c.scope: Deactivated successfully.
Feb 19 19:14:56 compute-0 podman[186887]: 2026-02-19 19:14:56.436739807 +0000 UTC m=+0.022457786 container died 6f7f6b29aa5c747c376dd46dc00fbc27e3bc82fd22a26f9c73cffe17e8c7ac8c (image=38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, config_id=nova_compute_init, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '5341527c8dd34d6abf221d866e12a549ca4c7217b5db19aa77437583172ec720'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Feb 19 19:14:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6f7f6b29aa5c747c376dd46dc00fbc27e3bc82fd22a26f9c73cffe17e8c7ac8c-userdata-shm.mount: Deactivated successfully.
Feb 19 19:14:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-11060ae14498fc741c28d209fe9da492ee37f1c5bacd74cb87310ac3223f4ef4-merged.mount: Deactivated successfully.
Feb 19 19:14:56 compute-0 podman[186887]: 2026-02-19 19:14:56.468270758 +0000 UTC m=+0.053988687 container cleanup 6f7f6b29aa5c747c376dd46dc00fbc27e3bc82fd22a26f9c73cffe17e8c7ac8c (image=38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, config_id=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '5341527c8dd34d6abf221d866e12a549ca4c7217b5db19aa77437583172ec720'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Feb 19 19:14:56 compute-0 systemd[1]: libpod-conmon-6f7f6b29aa5c747c376dd46dc00fbc27e3bc82fd22a26f9c73cffe17e8c7ac8c.scope: Deactivated successfully.
Feb 19 19:14:56 compute-0 sudo[186823]: pam_unix(sudo:session): session closed for user root
Feb 19 19:14:57 compute-0 nova_compute[186662]: 2026-02-19 19:14:57.423 186666 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Feb 19 19:14:57 compute-0 nova_compute[186662]: 2026-02-19 19:14:57.423 186666 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Feb 19 19:14:57 compute-0 nova_compute[186662]: 2026-02-19 19:14:57.423 186666 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Feb 19 19:14:57 compute-0 nova_compute[186662]: 2026-02-19 19:14:57.423 186666 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 19 19:14:57 compute-0 sshd-session[161675]: Connection closed by 192.168.122.30 port 56434
Feb 19 19:14:57 compute-0 sshd-session[161672]: pam_unix(sshd:session): session closed for user zuul
Feb 19 19:14:57 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Feb 19 19:14:57 compute-0 systemd[1]: session-24.scope: Consumed 1min 23.397s CPU time.
Feb 19 19:14:57 compute-0 systemd-logind[822]: Session 24 logged out. Waiting for processes to exit.
Feb 19 19:14:57 compute-0 systemd-logind[822]: Removed session 24.
Feb 19 19:14:57 compute-0 nova_compute[186662]: 2026-02-19 19:14:57.535 186666 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:14:57 compute-0 nova_compute[186662]: 2026-02-19 19:14:57.547 186666 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:14:57 compute-0 nova_compute[186662]: 2026-02-19 19:14:57.547 186666 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Feb 19 19:14:57 compute-0 nova_compute[186662]: 2026-02-19 19:14:57.574 186666 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Feb 19 19:14:57 compute-0 nova_compute[186662]: 2026-02-19 19:14:57.575 186666 WARNING oslo_config.cfg [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Feb 19 19:14:58 compute-0 nova_compute[186662]: 2026-02-19 19:14:58.490 186666 INFO nova.virt.driver [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 19 19:14:58 compute-0 nova_compute[186662]: 2026-02-19 19:14:58.561 186666 INFO nova.compute.provider_config [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.067 186666 DEBUG oslo_concurrency.lockutils [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.068 186666 DEBUG oslo_concurrency.lockutils [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.069 186666 DEBUG oslo_concurrency.lockutils [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.069 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.070 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.070 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.070 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.071 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.071 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.071 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.072 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.072 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.072 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.072 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.073 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.073 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.073 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.074 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.074 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.074 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.075 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.075 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.076 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.076 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.076 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.077 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.077 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.077 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.078 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.078 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.078 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.079 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.079 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.080 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.080 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.080 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.081 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.081 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.081 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.082 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.082 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.082 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.083 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.083 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.084 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.084 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.084 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.085 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.085 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.086 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.086 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.086 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.087 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.087 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.087 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.088 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.088 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.089 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.089 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.089 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.090 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.090 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.090 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.091 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.091 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.091 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.092 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.092 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.092 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.093 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.093 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.093 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.094 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.094 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.094 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.095 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.095 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.096 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.096 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.096 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.097 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.097 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.098 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.098 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.098 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.099 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.099 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.100 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.100 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.100 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.101 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.101 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.102 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.102 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.102 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.103 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.103 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.103 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.104 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.104 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.104 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.105 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.105 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.105 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.106 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.106 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.107 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.107 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.107 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.108 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.108 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.108 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.109 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.109 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.110 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.110 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.110 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.111 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.111 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.111 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.112 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.112 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.112 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.113 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.113 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.113 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.113 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.114 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.114 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.114 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.114 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.114 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.115 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.115 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.115 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.115 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.115 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.116 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.116 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.116 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.116 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.117 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.117 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.117 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.117 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.117 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.118 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.118 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.118 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.118 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.119 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.119 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.119 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.119 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.120 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.120 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.120 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.120 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.120 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.121 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.121 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.121 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.121 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.122 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.122 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.122 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.122 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.122 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.123 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.123 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.123 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.123 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.124 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.124 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.124 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.124 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.124 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.125 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.125 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.125 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.126 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.126 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.126 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.126 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.126 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.127 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.127 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.127 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.127 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.128 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.128 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.128 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.128 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.129 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.129 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.129 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.129 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.129 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.130 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.130 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.130 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.130 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.131 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.131 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.131 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.131 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.131 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.132 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.132 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.132 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.132 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.133 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.133 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.133 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.133 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.133 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.134 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.134 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.134 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.134 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.134 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.135 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.135 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.135 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.135 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.136 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.136 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.136 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.136 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.137 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.137 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.137 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.137 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.137 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.138 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.138 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.138 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.138 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.139 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.139 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.139 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.139 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.140 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.140 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.140 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.141 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.141 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.141 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.141 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.141 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.142 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.142 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.142 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.142 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.143 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.143 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.143 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.143 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.143 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.144 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.144 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.144 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.144 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.145 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.145 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.145 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.145 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.145 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.146 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.146 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.146 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.146 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.146 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.147 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.147 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.147 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.147 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.148 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.148 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.148 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.148 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.148 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.149 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.149 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.149 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.149 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.150 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.150 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.150 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.150 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.150 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.151 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.151 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.151 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.151 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.152 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.152 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.152 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.152 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.152 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.153 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.153 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.153 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.153 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.154 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.154 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.154 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.154 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.154 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.155 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.155 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.155 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.155 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.156 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.156 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.156 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.156 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.156 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.156 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.157 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.157 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.157 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.157 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.157 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.157 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.157 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.158 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.158 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.158 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.158 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.158 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.158 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.159 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.159 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.159 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.159 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.159 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.160 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.160 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.160 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.160 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.160 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.160 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.160 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.161 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.161 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.161 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.161 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.161 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.161 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.162 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.162 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.162 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.162 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.162 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.162 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.162 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.163 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.163 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.163 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.163 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.163 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.163 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.164 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.164 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.164 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.164 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.164 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.164 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.165 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.165 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.165 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.165 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.165 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.165 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.166 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.166 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.166 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.166 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.166 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.166 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.167 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.167 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.167 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.167 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.167 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.167 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.168 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.168 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.168 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.168 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.168 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.168 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.169 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.169 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.169 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.169 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.169 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.169 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.170 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.170 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.170 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.170 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.170 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.170 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.170 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.171 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.171 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.171 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.171 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.171 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.171 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.172 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.172 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.172 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.172 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.172 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.172 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.173 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.173 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.173 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.173 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.173 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.173 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.174 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.174 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.174 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.174 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.174 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.174 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.175 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.175 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.175 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.175 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.175 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.175 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.176 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.176 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.176 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.176 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.176 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.176 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.176 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.177 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.177 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.177 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.177 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.177 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.177 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.178 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.178 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.178 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.178 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.178 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.178 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.179 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.179 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.179 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.179 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.179 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.179 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.179 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.180 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.180 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.180 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.180 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.180 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.181 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.181 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.181 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.181 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.181 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.181 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.181 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.182 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.182 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.182 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.182 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.182 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.182 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.183 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.183 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.183 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.183 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.183 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.183 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.184 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.184 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.184 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.184 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.184 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.184 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.185 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.185 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.185 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.185 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.185 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.185 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.186 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.186 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.186 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.186 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.186 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.186 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.187 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.187 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.187 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.187 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.187 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.187 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.187 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.188 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.188 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.188 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.188 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.188 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.189 186666 WARNING oslo_config.cfg [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 19 19:14:59 compute-0 nova_compute[186662]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 19 19:14:59 compute-0 nova_compute[186662]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 19 19:14:59 compute-0 nova_compute[186662]: and ``live_migration_inbound_addr`` respectively.
Feb 19 19:14:59 compute-0 nova_compute[186662]: ).  Its value may be silently ignored in the future.
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.189 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.189 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.189 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.189 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.189 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.190 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.190 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.190 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.190 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.190 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.190 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.191 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.191 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.191 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.191 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.191 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.191 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.191 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.192 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.192 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.192 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.192 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.192 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.192 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.193 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.193 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.193 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.193 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.193 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.193 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.194 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.194 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.194 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.194 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.194 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.194 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.195 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.195 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.195 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.195 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.195 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.195 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.196 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.196 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.196 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.196 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.196 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.196 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.197 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.197 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.197 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.197 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.197 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.198 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.198 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.198 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.198 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.198 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.198 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.199 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.199 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.199 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.199 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.199 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.199 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.199 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.200 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.200 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.200 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.200 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.200 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.200 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.201 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.201 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.201 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.201 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.201 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.201 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.201 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.202 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.202 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.202 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.202 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.202 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.202 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.203 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.203 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.203 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.203 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.203 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.203 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.204 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.204 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.204 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.204 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.204 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.204 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.205 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.205 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.205 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.205 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.205 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.205 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.205 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.206 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.206 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.206 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.206 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.206 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.206 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.207 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.207 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.207 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.207 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.207 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.207 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.208 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.208 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.208 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.208 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.208 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.208 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.209 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.209 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.209 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.209 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.209 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.209 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.210 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.210 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.210 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.210 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.210 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.210 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.211 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.211 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.211 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.211 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.211 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.211 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.212 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.212 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.212 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.212 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.212 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.212 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.213 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.213 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.213 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.213 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.213 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.213 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.214 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.214 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.214 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.214 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.214 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.214 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.214 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.215 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.215 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.215 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.215 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.215 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.215 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.216 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.216 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.216 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.216 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.216 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.216 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.217 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.217 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.217 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.217 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.217 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.217 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.218 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.218 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.218 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.218 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.218 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.218 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.219 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.219 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.219 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.219 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.219 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.220 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.220 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.220 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.220 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.220 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.220 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.221 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.221 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.221 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.221 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.221 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.221 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.222 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.222 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.222 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.222 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.222 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.222 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.223 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.223 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.223 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.223 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.223 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.223 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.224 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.224 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.224 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.224 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.224 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.224 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.225 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.225 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.225 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.225 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.225 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.225 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.226 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.226 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.226 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.226 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.226 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.226 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.226 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.227 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.227 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.227 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.227 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.227 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.227 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.228 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.228 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.228 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.228 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.228 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.228 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.228 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.229 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.229 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.229 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.229 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.229 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.229 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.230 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.230 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.230 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.230 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.230 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.230 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.231 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.231 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.231 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.231 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.231 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.231 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.232 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.232 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.232 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.232 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.232 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.232 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.233 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.233 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.233 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.233 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.233 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.234 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.234 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.234 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.234 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.235 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.235 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.235 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.235 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.235 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.235 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.235 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.236 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.236 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.236 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.236 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.236 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.237 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.237 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.237 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.237 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.237 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.237 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.238 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.238 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.238 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.238 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.238 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.238 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.239 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.239 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.239 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.239 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.239 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.239 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.239 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.240 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.240 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.240 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.240 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.240 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.241 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.241 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.241 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.241 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.241 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.241 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.241 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.242 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.242 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.242 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.242 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.242 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.242 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.243 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.243 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.243 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.243 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.243 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.243 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.244 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.244 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.244 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.244 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.244 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.245 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.245 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.245 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.245 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.245 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.245 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.246 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.246 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.246 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.246 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.246 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.246 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.246 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.247 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.247 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.247 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.247 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.247 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.247 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.248 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.248 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.248 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.248 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.248 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.248 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.249 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.249 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.249 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.249 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.249 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.249 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.250 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.250 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.250 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.250 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.250 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.250 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.250 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.251 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.251 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.251 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.251 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.251 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.251 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.252 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.252 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.252 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.252 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.252 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.252 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.253 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.253 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.253 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.253 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.253 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.253 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.253 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.254 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.254 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.254 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.254 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.254 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.254 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.255 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.255 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.255 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.255 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.255 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.255 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.256 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.256 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.256 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.256 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.256 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.256 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.257 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.257 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.257 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.257 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.257 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.257 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.258 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.258 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.258 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.258 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.258 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.258 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.259 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.259 186666 DEBUG oslo_service.backend._eventlet.service [None req-bcdab262-cc61-4f57-b05c-5aabdca31bae - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.259 186666 INFO nova.service [-] Starting compute node (version 32.1.0-0.20251105112212.710ffbb.el10)
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.768 186666 DEBUG nova.virt.libvirt.host [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.786 186666 DEBUG nova.virt.libvirt.host [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f252b17f6e0> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Feb 19 19:14:59 compute-0 nova_compute[186662]: libvirt:  error : internal error: could not initialize domain event timer
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.787 186666 WARNING nova.virt.libvirt.host [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.787 186666 DEBUG nova.virt.libvirt.host [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f252b17f6e0> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.789 186666 DEBUG nova.virt.libvirt.host [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.790 186666 DEBUG nova.virt.libvirt.host [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.790 186666 INFO nova.utils [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] The default thread pool MainProcess.default is initialized
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.790 186666 DEBUG nova.virt.libvirt.host [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.791 186666 INFO nova.virt.libvirt.driver [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Connection event '1' reason 'None'
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.796 186666 INFO nova.virt.libvirt.host [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Libvirt host capabilities <capabilities>
Feb 19 19:14:59 compute-0 nova_compute[186662]: 
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <host>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <uuid>30a688d1-6db8-4679-8c84-eeb6a5211fb8</uuid>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <cpu>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <arch>x86_64</arch>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model>EPYC-Rome-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <vendor>AMD</vendor>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <microcode version='16777317'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <signature family='23' model='49' stepping='0'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='x2apic'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='tsc-deadline'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='osxsave'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='hypervisor'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='tsc_adjust'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='spec-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='stibp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='arch-capabilities'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='ssbd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='cmp_legacy'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='topoext'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='virt-ssbd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='lbrv'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='tsc-scale'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='vmcb-clean'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='pause-filter'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='pfthreshold'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='svme-addr-chk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='rdctl-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='skip-l1dfl-vmentry'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='mds-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature name='pschange-mc-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <pages unit='KiB' size='4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <pages unit='KiB' size='2048'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <pages unit='KiB' size='1048576'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </cpu>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <power_management>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <suspend_mem/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <suspend_disk/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <suspend_hybrid/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </power_management>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <iommu support='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <migration_features>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <live/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <uri_transports>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <uri_transport>tcp</uri_transport>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <uri_transport>rdma</uri_transport>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </uri_transports>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </migration_features>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <topology>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <cells num='1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <cell id='0'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:           <memory unit='KiB'>7864292</memory>
Feb 19 19:14:59 compute-0 nova_compute[186662]:           <pages unit='KiB' size='4'>1966073</pages>
Feb 19 19:14:59 compute-0 nova_compute[186662]:           <pages unit='KiB' size='2048'>0</pages>
Feb 19 19:14:59 compute-0 nova_compute[186662]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 19 19:14:59 compute-0 nova_compute[186662]:           <distances>
Feb 19 19:14:59 compute-0 nova_compute[186662]:             <sibling id='0' value='10'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:           </distances>
Feb 19 19:14:59 compute-0 nova_compute[186662]:           <cpus num='8'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:           </cpus>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         </cell>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </cells>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </topology>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <cache>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </cache>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <secmodel>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model>selinux</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <doi>0</doi>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </secmodel>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <secmodel>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model>dac</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <doi>0</doi>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </secmodel>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   </host>
Feb 19 19:14:59 compute-0 nova_compute[186662]: 
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <guest>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <os_type>hvm</os_type>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <arch name='i686'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <wordsize>32</wordsize>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <domain type='qemu'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <domain type='kvm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </arch>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <features>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <pae/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <nonpae/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <acpi default='on' toggle='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <apic default='on' toggle='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <cpuselection/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <deviceboot/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <disksnapshot default='on' toggle='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <externalSnapshot/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </features>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   </guest>
Feb 19 19:14:59 compute-0 nova_compute[186662]: 
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <guest>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <os_type>hvm</os_type>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <arch name='x86_64'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <wordsize>64</wordsize>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <domain type='qemu'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <domain type='kvm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </arch>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <features>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <acpi default='on' toggle='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <apic default='on' toggle='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <cpuselection/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <deviceboot/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <disksnapshot default='on' toggle='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <externalSnapshot/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </features>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   </guest>
Feb 19 19:14:59 compute-0 nova_compute[186662]: 
Feb 19 19:14:59 compute-0 nova_compute[186662]: </capabilities>
Feb 19 19:14:59 compute-0 nova_compute[186662]: 
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.801 186666 DEBUG nova.virt.libvirt.host [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.805 186666 DEBUG nova.virt.libvirt.host [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 19 19:14:59 compute-0 nova_compute[186662]: <domainCapabilities>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <path>/usr/libexec/qemu-kvm</path>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <domain>kvm</domain>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <arch>i686</arch>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <vcpu max='4096'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <iothreads supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <os supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <enum name='firmware'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <loader supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='type'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>rom</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>pflash</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='readonly'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>yes</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>no</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='secure'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>no</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </loader>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   </os>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <cpu>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <mode name='host-passthrough' supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='hostPassthroughMigratable'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>on</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>off</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </mode>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <mode name='maximum' supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='maximumMigratable'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>on</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>off</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </mode>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <mode name='host-model' supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <vendor>AMD</vendor>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='x2apic'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='tsc-deadline'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='hypervisor'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='tsc_adjust'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='spec-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='stibp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='ssbd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='cmp_legacy'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='overflow-recov'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='succor'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='amd-ssbd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='virt-ssbd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='lbrv'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='tsc-scale'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='vmcb-clean'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='flushbyasid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='pause-filter'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='pfthreshold'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='svme-addr-chk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='disable' name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </mode>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <mode name='custom' supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-noTSX'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-v5'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='ClearwaterForest'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ddpd-u'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='intel-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='lam'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sha512'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sm3'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sm4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='ClearwaterForest-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ddpd-u'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='intel-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='lam'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sha512'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sm3'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sm4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cooperlake'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cooperlake-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cooperlake-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Denverton'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mpx'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Denverton-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mpx'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Denverton-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Denverton-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Dhyana-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Genoa'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='auto-ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Genoa-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='auto-ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Genoa-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='auto-ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='perfmon-v2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Milan'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Milan-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Milan-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Milan-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Rome'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Rome-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Rome-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Rome-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Turin'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='auto-ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vp2intersect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibpb-brtype'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='perfmon-v2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbpb'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='srso-user-kernel-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Turin-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='auto-ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vp2intersect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibpb-brtype'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='perfmon-v2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbpb'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='srso-user-kernel-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-v5'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='GraniteRapids'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='GraniteRapids-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='GraniteRapids-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-128'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-256'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-512'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='GraniteRapids-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-128'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-256'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-512'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-noTSX'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-noTSX'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v5'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v6'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v7'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='IvyBridge'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='IvyBridge-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='IvyBridge-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='IvyBridge-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='KnightsMill'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-4fmaps'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-4vnniw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512er'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512pf'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='KnightsMill-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-4fmaps'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-4vnniw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512er'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512pf'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Opteron_G4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fma4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xop'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Opteron_G4-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fma4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xop'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Opteron_G5'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fma4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tbm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xop'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Opteron_G5-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fma4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tbm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xop'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SapphireRapids'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SapphireRapids-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SapphireRapids-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SapphireRapids-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SapphireRapids-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SierraForest'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SierraForest-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SierraForest-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='intel-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='lam'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SierraForest-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='intel-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='lam'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-v5'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Snowridge'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='core-capability'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mpx'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='split-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Snowridge-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='core-capability'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mpx'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='split-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Snowridge-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='core-capability'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='split-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Snowridge-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='core-capability'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='split-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Snowridge-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='athlon'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnow'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnowext'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='athlon-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnow'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnowext'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='core2duo'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='core2duo-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='coreduo'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='coreduo-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='n270'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='n270-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='phenom'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnow'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnowext'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='phenom-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnow'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnowext'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </mode>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <memoryBacking supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <enum name='sourceType'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <value>file</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <value>anonymous</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <value>memfd</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   </memoryBacking>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <disk supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='diskDevice'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>disk</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>cdrom</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>floppy</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>lun</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='bus'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>fdc</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>scsi</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>usb</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>sata</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='model'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio-transitional</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio-non-transitional</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <graphics supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='type'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>vnc</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>egl-headless</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>dbus</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </graphics>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <video supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='modelType'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>vga</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>cirrus</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>none</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>bochs</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>ramfb</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </video>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <hostdev supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='mode'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>subsystem</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='startupPolicy'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>default</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>mandatory</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>requisite</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>optional</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='subsysType'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>usb</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>pci</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>scsi</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='capsType'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='pciBackend'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </hostdev>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <rng supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='model'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio-transitional</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio-non-transitional</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='backendModel'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>random</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>egd</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>builtin</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <filesystem supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='driverType'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>path</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>handle</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtiofs</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </filesystem>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <tpm supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='model'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>tpm-tis</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>tpm-crb</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='backendModel'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>emulator</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>external</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='backendVersion'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>2.0</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </tpm>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <redirdev supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='bus'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>usb</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </redirdev>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <channel supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='type'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>pty</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>unix</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </channel>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <crypto supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='model'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='type'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>qemu</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='backendModel'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>builtin</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </crypto>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <interface supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='backendType'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>default</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>passt</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <panic supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='model'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>isa</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>hyperv</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </panic>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <console supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='type'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>null</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>vc</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>pty</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>dev</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>file</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>pipe</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>stdio</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>udp</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>tcp</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>unix</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>qemu-vdagent</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>dbus</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </console>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <features>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <gic supported='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <vmcoreinfo supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <genid supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <backingStoreInput supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <backup supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <async-teardown supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <s390-pv supported='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <ps2 supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <tdx supported='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <sev supported='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <sgx supported='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <hyperv supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='features'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>relaxed</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>vapic</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>spinlocks</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>vpindex</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>runtime</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>synic</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>stimer</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>reset</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>vendor_id</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>frequencies</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>reenlightenment</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>tlbflush</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>ipi</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>avic</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>emsr_bitmap</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>xmm_input</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <defaults>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <spinlocks>4095</spinlocks>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <stimer_direct>on</stimer_direct>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <tlbflush_direct>on</tlbflush_direct>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <tlbflush_extended>on</tlbflush_extended>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </defaults>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </hyperv>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <launchSecurity supported='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   </features>
Feb 19 19:14:59 compute-0 nova_compute[186662]: </domainCapabilities>
Feb 19 19:14:59 compute-0 nova_compute[186662]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.811 186666 DEBUG nova.virt.libvirt.host [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 19 19:14:59 compute-0 nova_compute[186662]: <domainCapabilities>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <path>/usr/libexec/qemu-kvm</path>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <domain>kvm</domain>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <arch>i686</arch>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <vcpu max='240'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <iothreads supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <os supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <enum name='firmware'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <loader supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='type'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>rom</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>pflash</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='readonly'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>yes</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>no</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='secure'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>no</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </loader>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   </os>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <cpu>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <mode name='host-passthrough' supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='hostPassthroughMigratable'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>on</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>off</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </mode>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <mode name='maximum' supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='maximumMigratable'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>on</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>off</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </mode>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <mode name='host-model' supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <vendor>AMD</vendor>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='x2apic'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='tsc-deadline'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='hypervisor'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='tsc_adjust'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='spec-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='stibp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='ssbd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='cmp_legacy'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='overflow-recov'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='succor'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='amd-ssbd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='virt-ssbd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='lbrv'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='tsc-scale'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='vmcb-clean'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='flushbyasid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='pause-filter'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='pfthreshold'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='svme-addr-chk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='disable' name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </mode>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <mode name='custom' supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-noTSX'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-v5'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='ClearwaterForest'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ddpd-u'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='intel-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='lam'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sha512'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sm3'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sm4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='ClearwaterForest-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ddpd-u'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='intel-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='lam'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sha512'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sm3'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sm4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cooperlake'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cooperlake-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cooperlake-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Denverton'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mpx'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Denverton-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mpx'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Denverton-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Denverton-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Dhyana-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Genoa'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='auto-ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Genoa-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='auto-ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Genoa-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='auto-ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='perfmon-v2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Milan'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Milan-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Milan-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Milan-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Rome'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Rome-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Rome-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Rome-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Turin'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='auto-ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vp2intersect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibpb-brtype'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='perfmon-v2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbpb'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='srso-user-kernel-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Turin-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='auto-ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vp2intersect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibpb-brtype'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='perfmon-v2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbpb'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='srso-user-kernel-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-v5'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='GraniteRapids'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='GraniteRapids-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='GraniteRapids-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-128'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-256'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-512'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='GraniteRapids-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-128'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-256'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-512'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-noTSX'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-noTSX'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v5'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v6'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v7'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='IvyBridge'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='IvyBridge-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='IvyBridge-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='IvyBridge-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='KnightsMill'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-4fmaps'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-4vnniw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512er'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512pf'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='KnightsMill-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-4fmaps'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-4vnniw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512er'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512pf'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Opteron_G4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fma4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xop'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Opteron_G4-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fma4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xop'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Opteron_G5'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fma4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tbm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xop'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Opteron_G5-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fma4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tbm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xop'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SapphireRapids'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SapphireRapids-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SapphireRapids-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SapphireRapids-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SapphireRapids-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SierraForest'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SierraForest-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SierraForest-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='intel-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='lam'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SierraForest-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='intel-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='lam'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-v5'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Snowridge'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='core-capability'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mpx'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='split-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Snowridge-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='core-capability'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mpx'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='split-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Snowridge-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='core-capability'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='split-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Snowridge-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='core-capability'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='split-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Snowridge-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='athlon'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnow'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnowext'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='athlon-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnow'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnowext'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='core2duo'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='core2duo-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='coreduo'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='coreduo-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='n270'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='n270-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='phenom'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnow'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnowext'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='phenom-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnow'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnowext'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </mode>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <memoryBacking supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <enum name='sourceType'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <value>file</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <value>anonymous</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <value>memfd</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   </memoryBacking>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <disk supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='diskDevice'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>disk</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>cdrom</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>floppy</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>lun</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='bus'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>ide</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>fdc</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>scsi</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>usb</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>sata</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='model'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio-transitional</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio-non-transitional</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <graphics supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='type'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>vnc</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>egl-headless</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>dbus</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </graphics>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <video supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='modelType'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>vga</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>cirrus</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>none</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>bochs</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>ramfb</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </video>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <hostdev supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='mode'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>subsystem</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='startupPolicy'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>default</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>mandatory</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>requisite</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>optional</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='subsysType'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>usb</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>pci</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>scsi</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='capsType'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='pciBackend'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </hostdev>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <rng supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='model'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio-transitional</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio-non-transitional</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='backendModel'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>random</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>egd</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>builtin</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <filesystem supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='driverType'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>path</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>handle</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtiofs</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </filesystem>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <tpm supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='model'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>tpm-tis</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>tpm-crb</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='backendModel'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>emulator</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>external</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='backendVersion'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>2.0</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </tpm>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <redirdev supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='bus'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>usb</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </redirdev>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <channel supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='type'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>pty</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>unix</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </channel>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <crypto supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='model'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='type'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>qemu</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='backendModel'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>builtin</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </crypto>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <interface supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='backendType'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>default</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>passt</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <panic supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='model'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>isa</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>hyperv</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </panic>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <console supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='type'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>null</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>vc</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>pty</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>dev</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>file</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>pipe</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>stdio</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>udp</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>tcp</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>unix</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>qemu-vdagent</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>dbus</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </console>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <features>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <gic supported='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <vmcoreinfo supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <genid supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <backingStoreInput supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <backup supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <async-teardown supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <s390-pv supported='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <ps2 supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <tdx supported='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <sev supported='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <sgx supported='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <hyperv supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='features'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>relaxed</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>vapic</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>spinlocks</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>vpindex</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>runtime</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>synic</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>stimer</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>reset</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>vendor_id</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>frequencies</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>reenlightenment</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>tlbflush</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>ipi</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>avic</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>emsr_bitmap</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>xmm_input</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <defaults>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <spinlocks>4095</spinlocks>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <stimer_direct>on</stimer_direct>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <tlbflush_direct>on</tlbflush_direct>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <tlbflush_extended>on</tlbflush_extended>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </defaults>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </hyperv>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <launchSecurity supported='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   </features>
Feb 19 19:14:59 compute-0 nova_compute[186662]: </domainCapabilities>
Feb 19 19:14:59 compute-0 nova_compute[186662]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.854 186666 DEBUG nova.virt.libvirt.host [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.857 186666 DEBUG nova.virt.libvirt.host [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 19 19:14:59 compute-0 nova_compute[186662]: <domainCapabilities>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <path>/usr/libexec/qemu-kvm</path>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <domain>kvm</domain>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <arch>x86_64</arch>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <vcpu max='4096'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <iothreads supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <os supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <enum name='firmware'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <value>efi</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <loader supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='type'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>rom</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>pflash</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='readonly'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>yes</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>no</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='secure'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>yes</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>no</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </loader>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   </os>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <cpu>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <mode name='host-passthrough' supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='hostPassthroughMigratable'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>on</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>off</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </mode>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <mode name='maximum' supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='maximumMigratable'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>on</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>off</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </mode>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <mode name='host-model' supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <vendor>AMD</vendor>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='x2apic'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='tsc-deadline'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='hypervisor'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='tsc_adjust'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='spec-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='stibp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='ssbd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='cmp_legacy'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='overflow-recov'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='succor'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='amd-ssbd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='virt-ssbd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='lbrv'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='tsc-scale'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='vmcb-clean'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='flushbyasid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='pause-filter'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='pfthreshold'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='svme-addr-chk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='disable' name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </mode>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <mode name='custom' supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-noTSX'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-v5'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='ClearwaterForest'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ddpd-u'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='intel-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='lam'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sha512'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sm3'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sm4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='ClearwaterForest-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ddpd-u'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='intel-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='lam'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sha512'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sm3'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sm4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cooperlake'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cooperlake-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cooperlake-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Denverton'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mpx'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Denverton-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mpx'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Denverton-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Denverton-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Dhyana-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Genoa'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='auto-ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Genoa-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='auto-ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Genoa-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='auto-ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='perfmon-v2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Milan'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Milan-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Milan-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Milan-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Rome'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Rome-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Rome-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Rome-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Turin'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='auto-ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vp2intersect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibpb-brtype'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='perfmon-v2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbpb'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='srso-user-kernel-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Turin-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='auto-ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vp2intersect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibpb-brtype'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='perfmon-v2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbpb'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='srso-user-kernel-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-v5'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='GraniteRapids'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='GraniteRapids-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='GraniteRapids-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-128'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-256'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-512'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='GraniteRapids-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-128'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-256'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-512'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-noTSX'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Haswell-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-noTSX'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v5'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v6'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v7'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='IvyBridge'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='IvyBridge-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='IvyBridge-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='IvyBridge-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='KnightsMill'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-4fmaps'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-4vnniw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512er'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512pf'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='KnightsMill-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-4fmaps'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-4vnniw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512er'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512pf'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Opteron_G4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fma4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xop'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Opteron_G4-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fma4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xop'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Opteron_G5'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fma4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tbm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xop'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Opteron_G5-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fma4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tbm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xop'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SapphireRapids'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SapphireRapids-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SapphireRapids-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SapphireRapids-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SapphireRapids-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SierraForest'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SierraForest-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SierraForest-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='intel-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='lam'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='SierraForest-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='intel-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='lam'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-v5'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Snowridge'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='core-capability'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mpx'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='split-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Snowridge-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='core-capability'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mpx'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='split-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Snowridge-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='core-capability'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='split-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Snowridge-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='core-capability'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='split-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Snowridge-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='athlon'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnow'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnowext'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='athlon-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnow'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnowext'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='core2duo'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='core2duo-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='coreduo'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='coreduo-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='n270'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='n270-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='phenom'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnow'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnowext'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='phenom-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnow'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='3dnowext'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </mode>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <memoryBacking supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <enum name='sourceType'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <value>file</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <value>anonymous</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <value>memfd</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   </memoryBacking>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <disk supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='diskDevice'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>disk</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>cdrom</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>floppy</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>lun</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='bus'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>fdc</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>scsi</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>usb</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>sata</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='model'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio-transitional</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio-non-transitional</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <graphics supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='type'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>vnc</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>egl-headless</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>dbus</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </graphics>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <video supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='modelType'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>vga</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>cirrus</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>none</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>bochs</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>ramfb</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </video>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <hostdev supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='mode'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>subsystem</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='startupPolicy'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>default</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>mandatory</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>requisite</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>optional</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='subsysType'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>usb</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>pci</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>scsi</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='capsType'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='pciBackend'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </hostdev>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <rng supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='model'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio-transitional</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtio-non-transitional</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='backendModel'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>random</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>egd</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>builtin</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <filesystem supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='driverType'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>path</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>handle</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>virtiofs</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </filesystem>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <tpm supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='model'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>tpm-tis</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>tpm-crb</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='backendModel'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>emulator</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>external</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='backendVersion'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>2.0</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </tpm>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <redirdev supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='bus'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>usb</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </redirdev>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <channel supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='type'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>pty</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>unix</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </channel>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <crypto supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='model'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='type'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>qemu</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='backendModel'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>builtin</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </crypto>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <interface supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='backendType'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>default</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>passt</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <panic supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='model'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>isa</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>hyperv</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </panic>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <console supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='type'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>null</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>vc</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>pty</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>dev</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>file</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>pipe</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>stdio</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>udp</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>tcp</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>unix</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>qemu-vdagent</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>dbus</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </console>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <features>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <gic supported='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <vmcoreinfo supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <genid supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <backingStoreInput supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <backup supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <async-teardown supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <s390-pv supported='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <ps2 supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <tdx supported='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <sev supported='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <sgx supported='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <hyperv supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='features'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>relaxed</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>vapic</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>spinlocks</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>vpindex</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>runtime</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>synic</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>stimer</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>reset</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>vendor_id</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>frequencies</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>reenlightenment</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>tlbflush</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>ipi</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>avic</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>emsr_bitmap</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>xmm_input</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <defaults>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <spinlocks>4095</spinlocks>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <stimer_direct>on</stimer_direct>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <tlbflush_direct>on</tlbflush_direct>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <tlbflush_extended>on</tlbflush_extended>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </defaults>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </hyperv>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <launchSecurity supported='no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   </features>
Feb 19 19:14:59 compute-0 nova_compute[186662]: </domainCapabilities>
Feb 19 19:14:59 compute-0 nova_compute[186662]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Feb 19 19:14:59 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.918 186666 DEBUG nova.virt.libvirt.host [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 19 19:14:59 compute-0 nova_compute[186662]: <domainCapabilities>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <path>/usr/libexec/qemu-kvm</path>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <domain>kvm</domain>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <arch>x86_64</arch>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <vcpu max='240'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <iothreads supported='yes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <os supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <enum name='firmware'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <loader supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='type'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>rom</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>pflash</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='readonly'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>yes</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>no</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='secure'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>no</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </loader>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   </os>
Feb 19 19:14:59 compute-0 nova_compute[186662]:   <cpu>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <mode name='host-passthrough' supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='hostPassthroughMigratable'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>on</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>off</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </mode>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <mode name='maximum' supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <enum name='maximumMigratable'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>on</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <value>off</value>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </mode>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <mode name='host-model' supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <vendor>AMD</vendor>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='x2apic'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='tsc-deadline'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='hypervisor'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='tsc_adjust'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='spec-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='stibp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='ssbd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='cmp_legacy'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='overflow-recov'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='succor'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='amd-ssbd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='virt-ssbd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='lbrv'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='tsc-scale'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='vmcb-clean'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='flushbyasid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='pause-filter'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='pfthreshold'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='svme-addr-chk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <feature policy='disable' name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     </mode>
Feb 19 19:14:59 compute-0 nova_compute[186662]:     <mode name='custom' supported='yes'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-noTSX'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Broadwell-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cascadelake-Server-v5'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='ClearwaterForest'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ddpd-u'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='intel-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='lam'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sha512'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sm3'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sm4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='ClearwaterForest-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bhi-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ddpd-u'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='intel-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ipred-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='lam'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rrsba-ctrl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sha512'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sm3'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sm4'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cooperlake'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cooperlake-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Cooperlake-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Denverton'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mpx'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Denverton-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mpx'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Denverton-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Denverton-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='Dhyana-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Genoa'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='auto-ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Genoa-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='auto-ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Genoa-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='auto-ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='perfmon-v2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Milan'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Milan-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Milan-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Milan-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Rome'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Rome-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Rome-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Rome-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Turin'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='auto-ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vp2intersect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibpb-brtype'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='perfmon-v2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbpb'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='srso-user-kernel-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-Turin-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amd-psfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='auto-ibrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vp2intersect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fs-gs-base-ns'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibpb-brtype'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='no-nested-data-bp'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='null-sel-clr-base'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='perfmon-v2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbpb'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='srso-user-kernel-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='stibp-always-on'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-v4'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='EPYC-v5'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='GraniteRapids'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='GraniteRapids-v1'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='GraniteRapids-v2'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-128'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-256'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-512'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 19 19:14:59 compute-0 nova_compute[186662]:       <blockers model='GraniteRapids-v3'>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-128'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-256'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx10-512'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:14:59 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='prefetchiti'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Haswell'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Haswell-IBRS'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Haswell-noTSX'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Haswell-v1'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Haswell-v2'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Haswell-v3'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Haswell-v4'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-noTSX'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v1'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v2'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v3'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v4'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v5'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v6'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Icelake-Server-v7'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='IvyBridge'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='IvyBridge-IBRS'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='IvyBridge-v1'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='IvyBridge-v2'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='KnightsMill'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-4fmaps'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-4vnniw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512er'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512pf'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='KnightsMill-v1'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-4fmaps'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-4vnniw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512er'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512pf'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Opteron_G4'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fma4'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xop'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Opteron_G4-v1'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fma4'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xop'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Opteron_G5'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fma4'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='tbm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xop'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Opteron_G5-v1'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fma4'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='tbm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xop'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='SapphireRapids'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='SapphireRapids-v1'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='SapphireRapids-v2'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='SapphireRapids-v3'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='SapphireRapids-v4'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='amx-bf16'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='amx-int8'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='amx-tile'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-bf16'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-fp16'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512-vpopcntdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bitalg'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512ifma'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vbmi2'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrc'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fzrm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='la57'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='taa-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='tsx-ldtrk'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xfd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='SierraForest'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='SierraForest-v1'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='SierraForest-v2'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='bhi-ctrl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='intel-psfd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ipred-ctrl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='lam'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rrsba-ctrl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='SierraForest-v3'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-ifma'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-ne-convert'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-vnni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx-vnni-int8'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='bhi-ctrl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='bus-lock-detect'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='cmpccxadd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fbsdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='fsrs'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ibrs-all'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='intel-psfd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ipred-ctrl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='lam'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='mcdt-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pbrsb-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='psdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rrsba-ctrl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='sbdr-ssdp-no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='serialize'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vaes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='vpclmulqdq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-IBRS'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-v1'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-v2'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-v3'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Skylake-Client-v4'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-IBRS'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-v1'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-v2'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='hle'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='rtm'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-v3'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-v4'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Skylake-Server-v5'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512bw'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512cd'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512dq'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512f'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='avx512vl'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='invpcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pcid'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='pku'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Snowridge'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='core-capability'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='mpx'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='split-lock-detect'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Snowridge-v1'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='core-capability'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='mpx'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='split-lock-detect'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Snowridge-v2'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='core-capability'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='split-lock-detect'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Snowridge-v3'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='core-capability'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='split-lock-detect'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='Snowridge-v4'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='cldemote'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='erms'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='gfni'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='movdir64b'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='movdiri'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='xsaves'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='athlon'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='3dnow'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='3dnowext'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='athlon-v1'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='3dnow'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='3dnowext'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='core2duo'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='core2duo-v1'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='coreduo'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='coreduo-v1'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='n270'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='n270-v1'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='ss'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='phenom'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='3dnow'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='3dnowext'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <blockers model='phenom-v1'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='3dnow'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <feature name='3dnowext'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </blockers>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     </mode>
Feb 19 19:15:00 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:15:00 compute-0 nova_compute[186662]:   <memoryBacking supported='yes'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <enum name='sourceType'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <value>file</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <value>anonymous</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <value>memfd</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:   </memoryBacking>
Feb 19 19:15:00 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <disk supported='yes'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='diskDevice'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>disk</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>cdrom</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>floppy</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>lun</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='bus'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>ide</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>fdc</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>scsi</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>virtio</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>usb</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>sata</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='model'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>virtio</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>virtio-transitional</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>virtio-non-transitional</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <graphics supported='yes'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='type'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>vnc</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>egl-headless</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>dbus</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     </graphics>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <video supported='yes'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='modelType'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>vga</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>cirrus</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>virtio</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>none</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>bochs</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>ramfb</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     </video>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <hostdev supported='yes'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='mode'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>subsystem</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='startupPolicy'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>default</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>mandatory</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>requisite</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>optional</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='subsysType'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>usb</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>pci</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>scsi</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='capsType'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='pciBackend'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     </hostdev>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <rng supported='yes'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='model'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>virtio</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>virtio-transitional</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>virtio-non-transitional</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='backendModel'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>random</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>egd</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>builtin</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <filesystem supported='yes'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='driverType'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>path</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>handle</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>virtiofs</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     </filesystem>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <tpm supported='yes'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='model'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>tpm-tis</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>tpm-crb</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='backendModel'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>emulator</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>external</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='backendVersion'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>2.0</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     </tpm>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <redirdev supported='yes'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='bus'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>usb</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     </redirdev>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <channel supported='yes'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='type'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>pty</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>unix</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     </channel>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <crypto supported='yes'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='model'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='type'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>qemu</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='backendModel'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>builtin</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     </crypto>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <interface supported='yes'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='backendType'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>default</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>passt</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <panic supported='yes'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='model'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>isa</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>hyperv</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     </panic>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <console supported='yes'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='type'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>null</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>vc</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>pty</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>dev</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>file</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>pipe</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>stdio</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>udp</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>tcp</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>unix</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>qemu-vdagent</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>dbus</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     </console>
Feb 19 19:15:00 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:15:00 compute-0 nova_compute[186662]:   <features>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <gic supported='no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <vmcoreinfo supported='yes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <genid supported='yes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <backingStoreInput supported='yes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <backup supported='yes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <async-teardown supported='yes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <s390-pv supported='no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <ps2 supported='yes'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <tdx supported='no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <sev supported='no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <sgx supported='no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <hyperv supported='yes'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <enum name='features'>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>relaxed</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>vapic</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>spinlocks</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>vpindex</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>runtime</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>synic</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>stimer</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>reset</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>vendor_id</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>frequencies</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>reenlightenment</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>tlbflush</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>ipi</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>avic</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>emsr_bitmap</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <value>xmm_input</value>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </enum>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       <defaults>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <spinlocks>4095</spinlocks>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <stimer_direct>on</stimer_direct>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <tlbflush_direct>on</tlbflush_direct>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <tlbflush_extended>on</tlbflush_extended>
Feb 19 19:15:00 compute-0 nova_compute[186662]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 19 19:15:00 compute-0 nova_compute[186662]:       </defaults>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     </hyperv>
Feb 19 19:15:00 compute-0 nova_compute[186662]:     <launchSecurity supported='no'/>
Feb 19 19:15:00 compute-0 nova_compute[186662]:   </features>
Feb 19 19:15:00 compute-0 nova_compute[186662]: </domainCapabilities>
Feb 19 19:15:00 compute-0 nova_compute[186662]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Feb 19 19:15:00 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.974 186666 DEBUG nova.virt.libvirt.host [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Feb 19 19:15:00 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.975 186666 INFO nova.virt.libvirt.host [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Secure Boot support detected
Feb 19 19:15:00 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.979 186666 INFO nova.virt.libvirt.driver [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 19 19:15:00 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.990 186666 DEBUG nova.virt.libvirt.driver [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] cpu compare xml: <cpu match="exact">
Feb 19 19:15:00 compute-0 nova_compute[186662]:   <model>Nehalem</model>
Feb 19 19:15:00 compute-0 nova_compute[186662]: </cpu>
Feb 19 19:15:00 compute-0 nova_compute[186662]:  _compare_cpu /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10922
Feb 19 19:15:00 compute-0 nova_compute[186662]: 2026-02-19 19:14:59.991 186666 DEBUG nova.virt.libvirt.driver [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1177
Feb 19 19:15:00 compute-0 nova_compute[186662]: 2026-02-19 19:15:00.296 186666 WARNING nova.virt.libvirt.driver [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Feb 19 19:15:00 compute-0 nova_compute[186662]: 2026-02-19 19:15:00.297 186666 DEBUG nova.virt.libvirt.volume.mount [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 19 19:15:00 compute-0 nova_compute[186662]: 2026-02-19 19:15:00.500 186666 INFO nova.virt.node [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Determined node identity 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 from /var/lib/nova/compute_id
Feb 19 19:15:00 compute-0 rsyslogd[1019]: imjournal from <np0005624716:nova_compute>: begin to drop messages due to rate-limiting
Feb 19 19:15:01 compute-0 nova_compute[186662]: 2026-02-19 19:15:01.010 186666 WARNING nova.compute.manager [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Compute nodes ['11ecaf50-b8a2-48b5-a41c-a8b0b10798d6'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Feb 19 19:15:02 compute-0 nova_compute[186662]: 2026-02-19 19:15:02.022 186666 INFO nova.compute.manager [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 19 19:15:03 compute-0 sshd-session[186968]: Accepted publickey for zuul from 192.168.122.30 port 34738 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 19:15:03 compute-0 systemd-logind[822]: New session 26 of user zuul.
Feb 19 19:15:03 compute-0 systemd[1]: Started Session 26 of User zuul.
Feb 19 19:15:03 compute-0 sshd-session[186968]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 19:15:03 compute-0 nova_compute[186662]: 2026-02-19 19:15:03.062 186666 WARNING nova.compute.manager [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Feb 19 19:15:03 compute-0 nova_compute[186662]: 2026-02-19 19:15:03.063 186666 DEBUG oslo_concurrency.lockutils [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:15:03 compute-0 nova_compute[186662]: 2026-02-19 19:15:03.063 186666 DEBUG oslo_concurrency.lockutils [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:15:03 compute-0 nova_compute[186662]: 2026-02-19 19:15:03.063 186666 DEBUG oslo_concurrency.lockutils [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:15:03 compute-0 nova_compute[186662]: 2026-02-19 19:15:03.063 186666 DEBUG nova.compute.resource_tracker [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:15:03 compute-0 nova_compute[186662]: 2026-02-19 19:15:03.193 186666 WARNING nova.virt.libvirt.driver [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:15:03 compute-0 nova_compute[186662]: 2026-02-19 19:15:03.195 186666 DEBUG oslo_concurrency.processutils [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:15:03 compute-0 nova_compute[186662]: 2026-02-19 19:15:03.208 186666 DEBUG oslo_concurrency.processutils [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:15:03 compute-0 nova_compute[186662]: 2026-02-19 19:15:03.208 186666 DEBUG nova.compute.resource_tracker [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6210MB free_disk=73.20130157470703GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:15:03 compute-0 nova_compute[186662]: 2026-02-19 19:15:03.208 186666 DEBUG oslo_concurrency.lockutils [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:15:03 compute-0 nova_compute[186662]: 2026-02-19 19:15:03.208 186666 DEBUG oslo_concurrency.lockutils [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:15:03 compute-0 nova_compute[186662]: 2026-02-19 19:15:03.714 186666 WARNING nova.compute.resource_tracker [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] No compute node record for compute-0.ctlplane.example.com:11ecaf50-b8a2-48b5-a41c-a8b0b10798d6: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 could not be found.
Feb 19 19:15:04 compute-0 python3.9[187122]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 19 19:15:04 compute-0 nova_compute[186662]: 2026-02-19 19:15:04.220 186666 INFO nova.compute.resource_tracker [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6
Feb 19 19:15:04 compute-0 sshd-session[186966]: Invalid user oracle from 27.50.25.190 port 41580
Feb 19 19:15:04 compute-0 sshd-session[186964]: Invalid user ubuntu from 182.75.216.74 port 38100
Feb 19 19:15:04 compute-0 sshd-session[186966]: Received disconnect from 27.50.25.190 port 41580:11: Bye Bye [preauth]
Feb 19 19:15:04 compute-0 sshd-session[186966]: Disconnected from invalid user oracle 27.50.25.190 port 41580 [preauth]
Feb 19 19:15:04 compute-0 sshd-session[186964]: Received disconnect from 182.75.216.74 port 38100:11: Bye Bye [preauth]
Feb 19 19:15:04 compute-0 sshd-session[186964]: Disconnected from invalid user ubuntu 182.75.216.74 port 38100 [preauth]
Feb 19 19:15:05 compute-0 sudo[187276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mayjicfzhzbgopfnemougstcehmlvjhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528504.66983-47-82876414243933/AnsiballZ_systemd_service.py'
Feb 19 19:15:05 compute-0 sudo[187276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:05 compute-0 python3.9[187279]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 19 19:15:05 compute-0 systemd[1]: Reloading.
Feb 19 19:15:05 compute-0 systemd-rc-local-generator[187305]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:15:05 compute-0 systemd-sysv-generator[187310]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:15:05 compute-0 sudo[187276]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:05 compute-0 nova_compute[186662]: 2026-02-19 19:15:05.791 186666 DEBUG nova.compute.resource_tracker [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:15:05 compute-0 nova_compute[186662]: 2026-02-19 19:15:05.792 186666 DEBUG nova.compute.resource_tracker [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:15:03 up 46 min,  0 user,  load average: 0.57, 0.67, 0.54\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:15:06 compute-0 nova_compute[186662]: 2026-02-19 19:15:06.224 186666 INFO nova.scheduler.client.report [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] [req-16d2bffe-5d4c-4014-98cf-e0dc3bc697f5] Created resource provider record via placement API for resource provider with UUID 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 and name compute-0.ctlplane.example.com.
Feb 19 19:15:06 compute-0 nova_compute[186662]: 2026-02-19 19:15:06.242 186666 DEBUG nova.virt.libvirt.host [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 19 19:15:06 compute-0 nova_compute[186662]: ] _kernel_supports_amd_sev /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1953
Feb 19 19:15:06 compute-0 nova_compute[186662]: 2026-02-19 19:15:06.243 186666 INFO nova.virt.libvirt.host [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] kernel doesn't support AMD SEV
Feb 19 19:15:06 compute-0 nova_compute[186662]: 2026-02-19 19:15:06.243 186666 DEBUG nova.compute.provider_tree [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Updating inventory in ProviderTree for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Feb 19 19:15:06 compute-0 nova_compute[186662]: 2026-02-19 19:15:06.243 186666 DEBUG nova.virt.libvirt.driver [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Feb 19 19:15:06 compute-0 nova_compute[186662]: 2026-02-19 19:15:06.245 186666 DEBUG nova.virt.libvirt.driver [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Libvirt baseline CPU <cpu>
Feb 19 19:15:06 compute-0 nova_compute[186662]:   <arch>x86_64</arch>
Feb 19 19:15:06 compute-0 nova_compute[186662]:   <model>Nehalem</model>
Feb 19 19:15:06 compute-0 nova_compute[186662]:   <vendor>AMD</vendor>
Feb 19 19:15:06 compute-0 nova_compute[186662]:   <topology sockets="8" cores="1" threads="1"/>
Feb 19 19:15:06 compute-0 nova_compute[186662]:   <maxphysaddr mode="emulate" bits="40"/>
Feb 19 19:15:06 compute-0 nova_compute[186662]: </cpu>
Feb 19 19:15:06 compute-0 nova_compute[186662]:  _get_guest_baseline_cpu_features /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13545
Feb 19 19:15:06 compute-0 podman[187398]: 2026-02-19 19:15:06.275889601 +0000 UTC m=+0.051441220 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 19 19:15:06 compute-0 python3.9[187490]: ansible-ansible.builtin.service_facts Invoked
Feb 19 19:15:06 compute-0 network[187507]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 19 19:15:06 compute-0 network[187508]: 'network-scripts' will be removed from distribution in near future.
Feb 19 19:15:06 compute-0 network[187509]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 19 19:15:06 compute-0 nova_compute[186662]: 2026-02-19 19:15:06.783 186666 DEBUG nova.scheduler.client.report [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Updated inventory for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Feb 19 19:15:06 compute-0 nova_compute[186662]: 2026-02-19 19:15:06.784 186666 DEBUG nova.compute.provider_tree [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Updating resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Feb 19 19:15:06 compute-0 nova_compute[186662]: 2026-02-19 19:15:06.784 186666 DEBUG nova.compute.provider_tree [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Updating inventory in ProviderTree for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Feb 19 19:15:06 compute-0 nova_compute[186662]: 2026-02-19 19:15:06.894 186666 DEBUG nova.compute.provider_tree [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Updating resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Feb 19 19:15:07 compute-0 nova_compute[186662]: 2026-02-19 19:15:07.401 186666 DEBUG nova.compute.resource_tracker [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:15:07 compute-0 nova_compute[186662]: 2026-02-19 19:15:07.401 186666 DEBUG oslo_concurrency.lockutils [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.193s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:15:07 compute-0 nova_compute[186662]: 2026-02-19 19:15:07.401 186666 DEBUG nova.service [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.12/site-packages/nova/service.py:177
Feb 19 19:15:07 compute-0 nova_compute[186662]: 2026-02-19 19:15:07.495 186666 DEBUG nova.service [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.12/site-packages/nova/service.py:194
Feb 19 19:15:07 compute-0 nova_compute[186662]: 2026-02-19 19:15:07.496 186666 DEBUG nova.servicegroup.drivers.db [None req-42e53f99-80a4-4177-bb91-a72ea9cce866 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.12/site-packages/nova/servicegroup/drivers/db.py:44
Feb 19 19:15:10 compute-0 sudo[187780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtuhsfcnojcpankujnsisdnawkftjhnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528510.6284735-85-141043504448222/AnsiballZ_systemd_service.py'
Feb 19 19:15:10 compute-0 sudo[187780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:11 compute-0 python3.9[187783]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:15:11 compute-0 sudo[187780]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:11 compute-0 podman[187785]: 2026-02-19 19:15:11.313499578 +0000 UTC m=+0.107167519 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 19 19:15:11 compute-0 sudo[187961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mavoczugsnioqwomgzchtevdheeoxfoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528511.5077045-105-99795134713903/AnsiballZ_file.py'
Feb 19 19:15:11 compute-0 sudo[187961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:12 compute-0 python3.9[187964]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:15:12 compute-0 sudo[187961]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:12 compute-0 rsyslogd[1019]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 19 19:15:12 compute-0 sudo[188115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgqfschxzvhmvitoahbooxbzvbtozfih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528512.4702702-121-159412234170842/AnsiballZ_file.py'
Feb 19 19:15:12 compute-0 sudo[188115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:12 compute-0 python3.9[188118]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:15:12 compute-0 sudo[188115]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:13 compute-0 sudo[188268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zystarudfvrytblbazasnzgobrhiesdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528513.1815932-139-243573546352356/AnsiballZ_command.py'
Feb 19 19:15:13 compute-0 sudo[188268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:13 compute-0 python3.9[188271]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:15:13 compute-0 sudo[188268]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:14 compute-0 python3.9[188423]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 19 19:15:15 compute-0 sudo[188573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvjoafibnpjwdqeyawvyhxdiigoxdsdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528514.7963424-175-57489271628892/AnsiballZ_systemd_service.py'
Feb 19 19:15:15 compute-0 sudo[188573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:15 compute-0 python3.9[188576]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 19 19:15:15 compute-0 systemd[1]: Reloading.
Feb 19 19:15:15 compute-0 systemd-rc-local-generator[188600]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:15:15 compute-0 systemd-sysv-generator[188606]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:15:15 compute-0 sudo[188573]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:16 compute-0 sudo[188767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzmzlklwjycvoxalbmazbkwysmudcora ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528515.8042567-191-87373774395439/AnsiballZ_command.py'
Feb 19 19:15:16 compute-0 sudo[188767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:16 compute-0 python3.9[188770]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:15:16 compute-0 sudo[188767]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:16 compute-0 sudo[188921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqwwvympfecddvxnzauzizqtkhoysbjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528516.6612463-209-205803119596041/AnsiballZ_file.py'
Feb 19 19:15:16 compute-0 sudo[188921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:17 compute-0 python3.9[188924]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:15:17 compute-0 sudo[188921]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:17 compute-0 python3.9[189074]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:15:18 compute-0 sudo[189226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slodwxmlxwxksamqrmmtezrnxyosnyyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528518.117976-241-47201744616592/AnsiballZ_group.py'
Feb 19 19:15:18 compute-0 sudo[189226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:18 compute-0 python3.9[189229]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Feb 19 19:15:18 compute-0 sudo[189226]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:19 compute-0 sudo[189379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxokujbyewmtgceexxvkaqttuwmoidcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528519.0537477-263-23886926735805/AnsiballZ_getent.py'
Feb 19 19:15:19 compute-0 sudo[189379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:19 compute-0 python3.9[189382]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Feb 19 19:15:19 compute-0 sudo[189379]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:20 compute-0 sudo[189533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvjidjjmqdxbylyfdinymqqhaukwbugj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528519.8721783-279-221048707196092/AnsiballZ_group.py'
Feb 19 19:15:20 compute-0 sudo[189533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:20 compute-0 python3.9[189536]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 19 19:15:20 compute-0 groupadd[189537]: group added to /etc/group: name=ceilometer, GID=42405
Feb 19 19:15:20 compute-0 groupadd[189537]: group added to /etc/gshadow: name=ceilometer
Feb 19 19:15:20 compute-0 groupadd[189537]: new group: name=ceilometer, GID=42405
Feb 19 19:15:20 compute-0 sudo[189533]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:21 compute-0 sudo[189692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgaswoeuzpiqfvvxakpeskapefvlwmfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528520.6430132-295-83212535544657/AnsiballZ_user.py'
Feb 19 19:15:21 compute-0 sudo[189692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:21 compute-0 python3.9[189695]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 19 19:15:21 compute-0 useradd[189697]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/1
Feb 19 19:15:21 compute-0 useradd[189697]: add 'ceilometer' to group 'libvirt'
Feb 19 19:15:21 compute-0 useradd[189697]: add 'ceilometer' to shadow group 'libvirt'
Feb 19 19:15:21 compute-0 sudo[189692]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:22 compute-0 python3.9[189853]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:15:23 compute-0 python3.9[189974]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771528522.3849547-347-33342715005424/.source.conf _original_basename=ceilometer.conf follow=False checksum=5c6a9288d15d1b05b1484826ce363ad306e9930c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:15:23 compute-0 python3.9[190124]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:15:24 compute-0 python3.9[190245]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771528523.5227294-347-242625803272111/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:15:25 compute-0 python3.9[190395]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:15:25 compute-0 python3.9[190516]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771528524.5475166-347-54906044276085/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:15:26 compute-0 python3.9[190666]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:15:26 compute-0 python3.9[190818]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:15:27 compute-0 python3.9[190970]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:15:27 compute-0 python3.9[191091]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771528526.9549637-465-127878608972831/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:15:28 compute-0 python3.9[191241]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:15:28 compute-0 python3.9[191362]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771528527.8844984-465-205830435698272/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:15:29 compute-0 python3.9[191512]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:15:29 compute-0 python3.9[191633]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771528529.1706994-523-31514884739116/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:15:30 compute-0 python3.9[191783]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:15:31 compute-0 python3.9[191904]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528530.2674334-555-9427141393651/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:15:31 compute-0 python3.9[192054]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:15:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:15:32.104 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:15:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:15:32.105 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:15:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:15:32.106 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:15:32 compute-0 python3.9[192175]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528531.442823-585-267158392482162/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:15:32 compute-0 python3.9[192326]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:15:33 compute-0 python3.9[192447]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528532.5455894-615-92447647129986/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:15:33 compute-0 sudo[192597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twtkoaymuybwvomyllmggptabvyxzpfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528533.7319813-645-109819015409551/AnsiballZ_file.py'
Feb 19 19:15:33 compute-0 sudo[192597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:34 compute-0 python3.9[192600]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:15:34 compute-0 sudo[192597]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:34 compute-0 sudo[192750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxxxxcwvtyotddngxzftkkhidrnnxokh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528534.3739994-661-90759147265706/AnsiballZ_file.py'
Feb 19 19:15:34 compute-0 sudo[192750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:34 compute-0 python3.9[192753]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:15:34 compute-0 sudo[192750]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:35 compute-0 python3.9[192903]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:15:36 compute-0 python3.9[193055]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:15:36 compute-0 podman[193181]: 2026-02-19 19:15:36.583225792 +0000 UTC m=+0.040341824 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0)
Feb 19 19:15:36 compute-0 python3.9[193220]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:15:37 compute-0 sudo[193378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kigcjmemjxgaqqjnujnzbnohpwlmpklz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528536.9191692-725-12303671083670/AnsiballZ_file.py'
Feb 19 19:15:37 compute-0 sudo[193378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:37 compute-0 python3.9[193381]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:15:37 compute-0 sudo[193378]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:37 compute-0 sudo[193531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scsirftwhfmqdrzwxqndkokuebgjfcro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528537.5011878-741-214784989914925/AnsiballZ_systemd_service.py'
Feb 19 19:15:37 compute-0 sudo[193531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:37 compute-0 python3.9[193534]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:15:37 compute-0 systemd[1]: Reloading.
Feb 19 19:15:38 compute-0 systemd-rc-local-generator[193558]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:15:38 compute-0 systemd-sysv-generator[193567]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:15:38 compute-0 systemd[1]: Listening on Podman API Socket.
Feb 19 19:15:38 compute-0 sudo[193531]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:38 compute-0 sudo[193731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvoppbyzmhnbxioigebccofoiprrbtcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528538.5611434-759-82972839887293/AnsiballZ_stat.py'
Feb 19 19:15:38 compute-0 sudo[193731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:38 compute-0 python3.9[193734]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:15:38 compute-0 sudo[193731]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:39 compute-0 sudo[193855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xefkjcxvzajmtyoyvxckvetgmndddqsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528538.5611434-759-82972839887293/AnsiballZ_copy.py'
Feb 19 19:15:39 compute-0 sudo[193855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:39 compute-0 python3.9[193858]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771528538.5611434-759-82972839887293/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:15:39 compute-0 sudo[193855]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:40 compute-0 sudo[194008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boldlhggubmljuvzhkrmvsdaoipwdyzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528540.1156957-801-32191787778964/AnsiballZ_file.py'
Feb 19 19:15:40 compute-0 sudo[194008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:40 compute-0 python3.9[194011]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:15:40 compute-0 sudo[194008]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:41 compute-0 sudo[194161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjcevxxxiipbrczchcpyjcvhqkgwxqvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528540.9235992-817-55269515181346/AnsiballZ_file.py'
Feb 19 19:15:41 compute-0 sudo[194161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:41 compute-0 python3.9[194164]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:15:41 compute-0 sudo[194161]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:41 compute-0 podman[194165]: 2026-02-19 19:15:41.53270624 +0000 UTC m=+0.122828309 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, 
maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:15:42 compute-0 python3.9[194341]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:15:43 compute-0 sudo[194762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wukpjdoeprhbetotaunayorknclqrbid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528543.4338255-885-264197517174197/AnsiballZ_container_config_data.py'
Feb 19 19:15:43 compute-0 sudo[194762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:43 compute-0 python3.9[194765]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Feb 19 19:15:44 compute-0 sudo[194762]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:44 compute-0 sudo[194915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqlvktpkiqphntuspfptlrztiurcswno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528544.3407726-907-192982434185251/AnsiballZ_container_config_hash.py'
Feb 19 19:15:44 compute-0 sudo[194915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:44 compute-0 python3.9[194918]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 19 19:15:44 compute-0 sudo[194915]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:45 compute-0 sudo[195068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shwdnynwkerfqreabbadjzlhkfirxtcc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771528545.201275-927-19985541577052/AnsiballZ_edpm_container_manage.py'
Feb 19 19:15:45 compute-0 sudo[195068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:46 compute-0 python3[195071]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 19 19:15:47 compute-0 podman[195085]: 2026-02-19 19:15:47.457692894 +0000 UTC m=+1.275860163 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Feb 19 19:15:47 compute-0 podman[195181]: 2026-02-19 19:15:47.561739798 +0000 UTC m=+0.044827858 container create a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 19 19:15:47 compute-0 podman[195181]: 2026-02-19 19:15:47.535585292 +0000 UTC m=+0.018673432 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Feb 19 19:15:47 compute-0 python3[195071]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Feb 19 19:15:47 compute-0 sudo[195068]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:48 compute-0 sudo[195369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvibiknpxgbcssvmaoreylcbktkeghcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528547.8596964-943-182866110054158/AnsiballZ_stat.py'
Feb 19 19:15:48 compute-0 sudo[195369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:48 compute-0 python3.9[195372]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:15:48 compute-0 sudo[195369]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:48 compute-0 sudo[195524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqnxguwgtxjpmmcirugqgqtbpdlvlygn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528548.6203787-961-188510686657736/AnsiballZ_file.py'
Feb 19 19:15:48 compute-0 sudo[195524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:48 compute-0 python3.9[195527]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:15:48 compute-0 sudo[195524]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:49 compute-0 sudo[195601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggvnsljsonnpxilhsukdhbkqiwuclgaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528548.6203787-961-188510686657736/AnsiballZ_stat.py'
Feb 19 19:15:49 compute-0 sudo[195601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:49 compute-0 python3.9[195604]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:15:49 compute-0 sudo[195601]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:49 compute-0 sudo[195753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyumcggckmrohihhaitaqeyncefeeiut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528549.4177606-961-241985248655027/AnsiballZ_copy.py'
Feb 19 19:15:49 compute-0 sudo[195753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:50 compute-0 python3.9[195756]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771528549.4177606-961-241985248655027/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:15:50 compute-0 sudo[195753]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:50 compute-0 sudo[195830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivmlhfctlrdkaressvqwxbplfsbahlxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528549.4177606-961-241985248655027/AnsiballZ_systemd.py'
Feb 19 19:15:50 compute-0 sudo[195830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:50 compute-0 python3.9[195833]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 19 19:15:50 compute-0 systemd[1]: Reloading.
Feb 19 19:15:50 compute-0 systemd-rc-local-generator[195857]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:15:50 compute-0 systemd-sysv-generator[195860]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:15:51 compute-0 sudo[195830]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:51 compute-0 sudo[195949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bemcjqetbjoqksfukafgqxhdzvrvrmnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528549.4177606-961-241985248655027/AnsiballZ_systemd.py'
Feb 19 19:15:51 compute-0 sudo[195949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:51 compute-0 python3.9[195952]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:15:51 compute-0 systemd[1]: Reloading.
Feb 19 19:15:51 compute-0 systemd-sysv-generator[195979]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:15:51 compute-0 systemd-rc-local-generator[195975]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:15:51 compute-0 systemd[1]: Starting podman_exporter container...
Feb 19 19:15:52 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:15:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14b6ac238d9002bced59feeceb2f5e27bf507afe1b31a9809382148dfb3f042c/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 19 19:15:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14b6ac238d9002bced59feeceb2f5e27bf507afe1b31a9809382148dfb3f042c/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Feb 19 19:15:52 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9.
Feb 19 19:15:52 compute-0 podman[195999]: 2026-02-19 19:15:52.078789283 +0000 UTC m=+0.125275226 container init a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 19 19:15:52 compute-0 podman_exporter[196014]: ts=2026-02-19T19:15:52.097Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Feb 19 19:15:52 compute-0 podman_exporter[196014]: ts=2026-02-19T19:15:52.097Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Feb 19 19:15:52 compute-0 podman_exporter[196014]: ts=2026-02-19T19:15:52.097Z caller=handler.go:94 level=info msg="enabled collectors"
Feb 19 19:15:52 compute-0 podman_exporter[196014]: ts=2026-02-19T19:15:52.097Z caller=handler.go:105 level=info collector=container
Feb 19 19:15:52 compute-0 podman[195999]: 2026-02-19 19:15:52.102079153 +0000 UTC m=+0.148565046 container start a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:15:52 compute-0 systemd[1]: Starting Podman API Service...
Feb 19 19:15:52 compute-0 podman[195999]: podman_exporter
Feb 19 19:15:52 compute-0 systemd[1]: Started Podman API Service.
Feb 19 19:15:52 compute-0 systemd[1]: Started podman_exporter container.
Feb 19 19:15:52 compute-0 sudo[195949]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:52 compute-0 podman[196025]: time="2026-02-19T19:15:52Z" level=info msg="/usr/bin/podman filtering at log level info"
Feb 19 19:15:52 compute-0 podman[196025]: time="2026-02-19T19:15:52Z" level=info msg="Setting parallel job count to 25"
Feb 19 19:15:52 compute-0 podman[196025]: time="2026-02-19T19:15:52Z" level=info msg="Using sqlite as database backend"
Feb 19 19:15:52 compute-0 podman[196025]: time="2026-02-19T19:15:52Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Feb 19 19:15:52 compute-0 podman[196025]: time="2026-02-19T19:15:52Z" level=info msg="Using systemd socket activation to determine API endpoint"
Feb 19 19:15:52 compute-0 podman[196025]: time="2026-02-19T19:15:52Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Feb 19 19:15:52 compute-0 podman[196025]: @ - - [19/Feb/2026:19:15:52 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Feb 19 19:15:52 compute-0 podman[196025]: time="2026-02-19T19:15:52Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:15:52 compute-0 podman[196025]: @ - - [19/Feb/2026:19:15:52 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 12571 "" "Go-http-client/1.1"
Feb 19 19:15:52 compute-0 podman_exporter[196014]: ts=2026-02-19T19:15:52.187Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Feb 19 19:15:52 compute-0 podman_exporter[196014]: ts=2026-02-19T19:15:52.188Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Feb 19 19:15:52 compute-0 podman_exporter[196014]: ts=2026-02-19T19:15:52.189Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Feb 19 19:15:52 compute-0 podman[196023]: 2026-02-19 19:15:52.198520769 +0000 UTC m=+0.088326585 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 19 19:15:52 compute-0 systemd[1]: a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9-3aa1463ad2934fa9.service: Main process exited, code=exited, status=1/FAILURE
Feb 19 19:15:52 compute-0 systemd[1]: a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9-3aa1463ad2934fa9.service: Failed with result 'exit-code'.
Feb 19 19:15:53 compute-0 python3.9[196209]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 19 19:15:54 compute-0 sudo[196359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctprpsjydlptaxzgnfdkqpihijswftci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528553.980683-1051-143260115555497/AnsiballZ_stat.py'
Feb 19 19:15:54 compute-0 sudo[196359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:54 compute-0 python3.9[196362]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:15:54 compute-0 sudo[196359]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:54 compute-0 sudo[196485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzmfzntnktymbbiqnpvyqzogbecrmgbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528553.980683-1051-143260115555497/AnsiballZ_copy.py'
Feb 19 19:15:54 compute-0 sudo[196485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:54 compute-0 python3.9[196488]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528553.980683-1051-143260115555497/.source.yaml _original_basename=.w70zutvh follow=False checksum=b790d0b3757345bf84fe8a964029d6da593b58c9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:15:54 compute-0 sudo[196485]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:55 compute-0 sudo[196638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzclewiaykwguxmmrenqjgaochczpokq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528555.2616305-1081-7339454416973/AnsiballZ_stat.py'
Feb 19 19:15:55 compute-0 sudo[196638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:55 compute-0 python3.9[196641]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:15:55 compute-0 sudo[196638]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:55 compute-0 sudo[196762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdkvqpehuztyycngxglupoeiuqnbsjnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528555.2616305-1081-7339454416973/AnsiballZ_copy.py'
Feb 19 19:15:55 compute-0 sudo[196762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:56 compute-0 python3.9[196765]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771528555.2616305-1081-7339454416973/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:15:56 compute-0 sudo[196762]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:56 compute-0 sudo[196915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlvkvntiuqomghbtiyyosvktbesiyhpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528556.7781115-1123-43650259582264/AnsiballZ_file.py'
Feb 19 19:15:56 compute-0 sudo[196915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:57 compute-0 python3.9[196918]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:15:57 compute-0 sudo[196915]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:57 compute-0 sudo[197068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvrgxtmhsekdwzzpwtsqtvatqfhfhfmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528557.4104776-1139-56804335621833/AnsiballZ_file.py'
Feb 19 19:15:57 compute-0 sudo[197068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:15:57 compute-0 python3.9[197071]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 19 19:15:57 compute-0 sudo[197068]: pam_unix(sudo:session): session closed for user root
Feb 19 19:15:58 compute-0 python3.9[197221]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:00 compute-0 sudo[197642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlwvizxhjzrkzxqmohxepjhkqxlcefjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528559.971952-1207-240731701411998/AnsiballZ_container_config_data.py'
Feb 19 19:16:00 compute-0 sudo[197642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:00 compute-0 python3.9[197645]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Feb 19 19:16:00 compute-0 sudo[197642]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:01 compute-0 sudo[197795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgvvtrkgopalhdzidtapbtnlcercdxpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528560.8936179-1229-203348990773798/AnsiballZ_container_config_hash.py'
Feb 19 19:16:01 compute-0 sudo[197795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:01 compute-0 python3.9[197798]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 19 19:16:01 compute-0 sudo[197795]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:01 compute-0 sudo[197948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecuomtfcqztptzvhhumnkedymavklwxv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771528561.6351972-1249-280770526185468/AnsiballZ_edpm_container_manage.py'
Feb 19 19:16:01 compute-0 sudo[197948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:02 compute-0 python3[197951]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 19 19:16:02 compute-0 nova_compute[186662]: 2026-02-19 19:16:02.497 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:16:02 compute-0 nova_compute[186662]: 2026-02-19 19:16:02.498 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:16:02 compute-0 nova_compute[186662]: 2026-02-19 19:16:02.498 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:16:02 compute-0 nova_compute[186662]: 2026-02-19 19:16:02.499 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:16:02 compute-0 nova_compute[186662]: 2026-02-19 19:16:02.499 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:16:02 compute-0 nova_compute[186662]: 2026-02-19 19:16:02.499 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:16:02 compute-0 nova_compute[186662]: 2026-02-19 19:16:02.500 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:16:03 compute-0 nova_compute[186662]: 2026-02-19 19:16:03.007 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:16:03 compute-0 nova_compute[186662]: 2026-02-19 19:16:03.007 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:16:03 compute-0 nova_compute[186662]: 2026-02-19 19:16:03.007 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:16:03 compute-0 nova_compute[186662]: 2026-02-19 19:16:03.516 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:16:03 compute-0 nova_compute[186662]: 2026-02-19 19:16:03.516 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:16:03 compute-0 nova_compute[186662]: 2026-02-19 19:16:03.516 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:16:03 compute-0 nova_compute[186662]: 2026-02-19 19:16:03.517 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:16:03 compute-0 nova_compute[186662]: 2026-02-19 19:16:03.611 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:16:03 compute-0 nova_compute[186662]: 2026-02-19 19:16:03.612 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:16:03 compute-0 nova_compute[186662]: 2026-02-19 19:16:03.627 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:16:03 compute-0 nova_compute[186662]: 2026-02-19 19:16:03.627 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6015MB free_disk=73.0661849975586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:16:03 compute-0 nova_compute[186662]: 2026-02-19 19:16:03.627 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:16:03 compute-0 nova_compute[186662]: 2026-02-19 19:16:03.627 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:16:04 compute-0 podman[197964]: 2026-02-19 19:16:04.419238594 +0000 UTC m=+2.237475592 image pull ba17a9079b86bdce2cd9e03cad5d6d4d255cc298efd741b09239c34192e5621b quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 19 19:16:04 compute-0 podman[198064]: 2026-02-19 19:16:04.51020738 +0000 UTC m=+0.035200051 container create 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, io.openshift.expose-services=, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, 
vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, version=9.7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 19 19:16:04 compute-0 podman[198064]: 2026-02-19 19:16:04.489609184 +0000 UTC m=+0.014601875 image pull ba17a9079b86bdce2cd9e03cad5d6d4d255cc298efd741b09239c34192e5621b quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 19 19:16:04 compute-0 python3[197951]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume 
/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 19 19:16:04 compute-0 sudo[197948]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:04 compute-0 nova_compute[186662]: 2026-02-19 19:16:04.671 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:16:04 compute-0 nova_compute[186662]: 2026-02-19 19:16:04.671 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:16:03 up 47 min,  0 user,  load average: 0.64, 0.67, 0.55\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:16:04 compute-0 nova_compute[186662]: 2026-02-19 19:16:04.692 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:16:05 compute-0 nova_compute[186662]: 2026-02-19 19:16:05.203 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:16:05 compute-0 sudo[198252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzxiaftnhthevnhtkzkgrgndnbxqdnql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528565.0543852-1265-162975081246781/AnsiballZ_stat.py'
Feb 19 19:16:05 compute-0 sudo[198252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:05 compute-0 python3.9[198255]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:16:05 compute-0 sudo[198252]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:05 compute-0 nova_compute[186662]: 2026-02-19 19:16:05.713 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:16:05 compute-0 nova_compute[186662]: 2026-02-19 19:16:05.713 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.086s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:16:05 compute-0 nova_compute[186662]: 2026-02-19 19:16:05.714 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:16:06 compute-0 sudo[198407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsmrhqaatxpjwlbigqcilzdniiuelbtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528565.908287-1283-30713754942197/AnsiballZ_file.py'
Feb 19 19:16:06 compute-0 sudo[198407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:06 compute-0 python3.9[198410]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:06 compute-0 sudo[198407]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:06 compute-0 sudo[198484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idtxtuxwjahmmhvdfjuwblnwtfsvisvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528565.908287-1283-30713754942197/AnsiballZ_stat.py'
Feb 19 19:16:06 compute-0 sudo[198484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:06 compute-0 python3.9[198487]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:16:06 compute-0 sudo[198484]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:06 compute-0 podman[198488]: 2026-02-19 19:16:06.773512201 +0000 UTC m=+0.050667397 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS)
Feb 19 19:16:07 compute-0 sudo[198657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohxlqxslczmoorpcmwtxthttnyfyewoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528566.7417197-1283-269534976578250/AnsiballZ_copy.py'
Feb 19 19:16:07 compute-0 sudo[198657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:07 compute-0 python3.9[198660]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771528566.7417197-1283-269534976578250/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:07 compute-0 sudo[198657]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:07 compute-0 sudo[198734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixvphzgchzsexwtydtyuhpyzzvrfhrse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528566.7417197-1283-269534976578250/AnsiballZ_systemd.py'
Feb 19 19:16:07 compute-0 sudo[198734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:07 compute-0 python3.9[198737]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 19 19:16:07 compute-0 systemd[1]: Reloading.
Feb 19 19:16:07 compute-0 systemd-rc-local-generator[198762]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:16:07 compute-0 systemd-sysv-generator[198768]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:16:08 compute-0 sudo[198734]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:08 compute-0 sudo[198852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adhfkzbmibfpryauxkwbtkzlpyquwiik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528566.7417197-1283-269534976578250/AnsiballZ_systemd.py'
Feb 19 19:16:08 compute-0 sudo[198852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:08 compute-0 python3.9[198855]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:16:08 compute-0 systemd[1]: Reloading.
Feb 19 19:16:08 compute-0 systemd-sysv-generator[198880]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:16:08 compute-0 systemd-rc-local-generator[198877]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:16:08 compute-0 systemd[1]: Starting openstack_network_exporter container...
Feb 19 19:16:09 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:16:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/225e17f7bd9f2e3cf22898d51aeae063cb349f38ff7400d61ce1ee54d461831d/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 19 19:16:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/225e17f7bd9f2e3cf22898d51aeae063cb349f38ff7400d61ce1ee54d461831d/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Feb 19 19:16:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/225e17f7bd9f2e3cf22898d51aeae063cb349f38ff7400d61ce1ee54d461831d/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 19 19:16:09 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397.
Feb 19 19:16:09 compute-0 podman[198901]: 2026-02-19 19:16:09.133608855 +0000 UTC m=+0.123232068 container init 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, 
org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 19 19:16:09 compute-0 openstack_network_exporter[198916]: INFO    19:16:09 main.go:48: registering *bridge.Collector
Feb 19 19:16:09 compute-0 openstack_network_exporter[198916]: INFO    19:16:09 main.go:48: registering *coverage.Collector
Feb 19 19:16:09 compute-0 openstack_network_exporter[198916]: INFO    19:16:09 main.go:48: registering *datapath.Collector
Feb 19 19:16:09 compute-0 openstack_network_exporter[198916]: INFO    19:16:09 main.go:48: registering *iface.Collector
Feb 19 19:16:09 compute-0 openstack_network_exporter[198916]: INFO    19:16:09 main.go:48: registering *memory.Collector
Feb 19 19:16:09 compute-0 openstack_network_exporter[198916]: INFO    19:16:09 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Feb 19 19:16:09 compute-0 openstack_network_exporter[198916]: INFO    19:16:09 main.go:48: registering *ovn.Collector
Feb 19 19:16:09 compute-0 openstack_network_exporter[198916]: INFO    19:16:09 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Feb 19 19:16:09 compute-0 openstack_network_exporter[198916]: INFO    19:16:09 main.go:48: registering *pmd_perf.Collector
Feb 19 19:16:09 compute-0 openstack_network_exporter[198916]: INFO    19:16:09 main.go:48: registering *pmd_rxq.Collector
Feb 19 19:16:09 compute-0 openstack_network_exporter[198916]: INFO    19:16:09 main.go:48: registering *vswitch.Collector
Feb 19 19:16:09 compute-0 openstack_network_exporter[198916]: NOTICE  19:16:09 main.go:76: listening on https://:9105/metrics
Feb 19 19:16:09 compute-0 podman[198901]: 2026-02-19 19:16:09.163620824 +0000 UTC m=+0.153243957 container start 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, managed_by=edpm_ansible, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 19 19:16:09 compute-0 podman[198901]: openstack_network_exporter
Feb 19 19:16:09 compute-0 systemd[1]: Started openstack_network_exporter container.
Feb 19 19:16:09 compute-0 sudo[198852]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:09 compute-0 podman[198926]: 2026-02-19 19:16:09.237877966 +0000 UTC m=+0.065864645 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.expose-services=, architecture=x86_64, version=9.7, io.openshift.tags=minimal rhel9, release=1770267347, config_id=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, build-date=2026-02-05T04:57:10Z)
Feb 19 19:16:10 compute-0 python3.9[199099]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 19 19:16:11 compute-0 sudo[199249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tntctckwmkanjgiqqjfxonzhetmmdwjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528570.8701532-1373-732093054964/AnsiballZ_stat.py'
Feb 19 19:16:11 compute-0 sudo[199249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:11 compute-0 python3.9[199252]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:16:11 compute-0 sudo[199249]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:11 compute-0 sudo[199375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zavjwdqyulsannyohvjlenywohudtfor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528570.8701532-1373-732093054964/AnsiballZ_copy.py'
Feb 19 19:16:11 compute-0 sudo[199375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:11 compute-0 python3.9[199378]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528570.8701532-1373-732093054964/.source.yaml _original_basename=.6imlljhd follow=False checksum=7bdc72de8e9db3ccf2aefc2f1163e73eb1ba6e7d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:11 compute-0 sudo[199375]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:11 compute-0 podman[199379]: 2026-02-19 19:16:11.724352992 +0000 UTC m=+0.055978552 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0)
Feb 19 19:16:12 compute-0 sudo[199553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caektvrzupjcfobjgcxanqkgubighzdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528572.0228627-1403-35255969792341/AnsiballZ_find.py'
Feb 19 19:16:12 compute-0 sudo[199553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:12 compute-0 python3.9[199556]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 19 19:16:12 compute-0 sudo[199553]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:13 compute-0 sudo[199706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnwumhgzzidtgeysjlykftszbxknzkxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528572.83937-1422-95864851890277/AnsiballZ_podman_container_info.py'
Feb 19 19:16:13 compute-0 sudo[199706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:13 compute-0 python3.9[199709]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Feb 19 19:16:13 compute-0 sudo[199706]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:14 compute-0 sudo[199872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxhjygqorechpdzskbqglanbsenytrik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528573.6797028-1430-137459535130017/AnsiballZ_podman_container_exec.py'
Feb 19 19:16:14 compute-0 sudo[199872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:14 compute-0 python3.9[199875]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 19 19:16:14 compute-0 systemd[1]: Started libpod-conmon-57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e.scope.
Feb 19 19:16:14 compute-0 podman[199876]: 2026-02-19 19:16:14.358880999 +0000 UTC m=+0.085359084 container exec 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS)
Feb 19 19:16:14 compute-0 podman[199876]: 2026-02-19 19:16:14.393181148 +0000 UTC m=+0.119659203 container exec_died 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:16:14 compute-0 systemd[1]: libpod-conmon-57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e.scope: Deactivated successfully.
Feb 19 19:16:14 compute-0 sudo[199872]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:14 compute-0 sudo[200057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abojjimfaurkgkibkmakomytyerowrtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528574.5951252-1438-94947363816778/AnsiballZ_podman_container_exec.py'
Feb 19 19:16:14 compute-0 sudo[200057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:14 compute-0 python3.9[200060]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 19 19:16:15 compute-0 systemd[1]: Started libpod-conmon-57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e.scope.
Feb 19 19:16:15 compute-0 podman[200061]: 2026-02-19 19:16:15.055881346 +0000 UTC m=+0.069168442 container exec 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2)
Feb 19 19:16:15 compute-0 podman[200081]: 2026-02-19 19:16:15.118861581 +0000 UTC m=+0.053477772 container exec_died 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:16:15 compute-0 podman[200061]: 2026-02-19 19:16:15.123749597 +0000 UTC m=+0.137036703 container exec_died 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 19 19:16:15 compute-0 systemd[1]: libpod-conmon-57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e.scope: Deactivated successfully.
Feb 19 19:16:15 compute-0 sudo[200057]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:15 compute-0 sudo[200243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkcxahzsxsimejugkyjozkvdknabgmfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528575.293434-1446-44708258297628/AnsiballZ_file.py'
Feb 19 19:16:15 compute-0 sudo[200243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:15 compute-0 python3.9[200246]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:15 compute-0 sudo[200243]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:16 compute-0 sudo[200396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnvizhqddnrepolxcuiqtmdslbmdoxuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528575.9442425-1455-246652840079442/AnsiballZ_podman_container_info.py'
Feb 19 19:16:16 compute-0 sudo[200396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:16 compute-0 python3.9[200399]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Feb 19 19:16:16 compute-0 sudo[200396]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:16 compute-0 sudo[200562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fefvyyhlboyekikqrmttzqvsyaxfbxup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528576.5150433-1463-280308575406143/AnsiballZ_podman_container_exec.py'
Feb 19 19:16:16 compute-0 sudo[200562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:16 compute-0 python3.9[200565]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 19 19:16:16 compute-0 systemd[1]: Started libpod-conmon-1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1.scope.
Feb 19 19:16:16 compute-0 podman[200566]: 2026-02-19 19:16:16.983137375 +0000 UTC m=+0.085579070 container exec 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent)
Feb 19 19:16:17 compute-0 podman[200586]: 2026-02-19 19:16:17.047864561 +0000 UTC m=+0.056004701 container exec_died 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 19 19:16:17 compute-0 podman[200566]: 2026-02-19 19:16:17.055370698 +0000 UTC m=+0.157812393 container exec_died 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:16:17 compute-0 systemd[1]: libpod-conmon-1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1.scope: Deactivated successfully.
Feb 19 19:16:17 compute-0 sudo[200562]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:17 compute-0 sudo[200748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sattwxgleymaedexmnwbrnytrpanlckk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528577.3396766-1471-151060097399086/AnsiballZ_podman_container_exec.py'
Feb 19 19:16:17 compute-0 sudo[200748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:17 compute-0 python3.9[200751]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 19 19:16:17 compute-0 systemd[1]: Started libpod-conmon-1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1.scope.
Feb 19 19:16:17 compute-0 podman[200752]: 2026-02-19 19:16:17.776655487 +0000 UTC m=+0.055026088 container exec 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 19 19:16:17 compute-0 podman[200752]: 2026-02-19 19:16:17.82308127 +0000 UTC m=+0.101451861 container exec_died 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0)
Feb 19 19:16:17 compute-0 systemd[1]: libpod-conmon-1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1.scope: Deactivated successfully.
Feb 19 19:16:17 compute-0 sudo[200748]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:18 compute-0 auditd[721]: Audit daemon rotating log files
Feb 19 19:16:18 compute-0 sudo[200934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tibajbikdeacyivjsxrhibppozqtnrbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528577.990968-1479-245904472028575/AnsiballZ_file.py'
Feb 19 19:16:18 compute-0 sudo[200934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:18 compute-0 python3.9[200937]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:18 compute-0 sudo[200934]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:18 compute-0 sudo[201087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caudyswaegpibhcvzysfzubyqnszqmed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528578.5454612-1488-112401616106462/AnsiballZ_podman_container_info.py'
Feb 19 19:16:18 compute-0 sudo[201087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:18 compute-0 python3.9[201090]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Feb 19 19:16:18 compute-0 sudo[201087]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:19 compute-0 sudo[201253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dveaqkfaiwjdndrptemjyeyulexyawsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528579.0824094-1496-152780893390976/AnsiballZ_podman_container_exec.py'
Feb 19 19:16:19 compute-0 sudo[201253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:19 compute-0 python3.9[201256]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 19 19:16:19 compute-0 systemd[1]: Started libpod-conmon-a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9.scope.
Feb 19 19:16:19 compute-0 podman[201257]: 2026-02-19 19:16:19.528880729 +0000 UTC m=+0.059881323 container exec a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 19 19:16:19 compute-0 podman[201277]: 2026-02-19 19:16:19.582840871 +0000 UTC m=+0.046893406 container exec_died a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 19 19:16:19 compute-0 podman[201257]: 2026-02-19 19:16:19.587559933 +0000 UTC m=+0.118560507 container exec_died a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:16:19 compute-0 systemd[1]: libpod-conmon-a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9.scope: Deactivated successfully.
Feb 19 19:16:19 compute-0 sudo[201253]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:19 compute-0 sudo[201439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebsjzhnexbkwbkjubsrxwdvimxobxqvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528579.7618327-1504-194457676102632/AnsiballZ_podman_container_exec.py'
Feb 19 19:16:19 compute-0 sudo[201439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:20 compute-0 python3.9[201442]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 19 19:16:20 compute-0 systemd[1]: Started libpod-conmon-a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9.scope.
Feb 19 19:16:20 compute-0 podman[201443]: 2026-02-19 19:16:20.238864442 +0000 UTC m=+0.080352106 container exec a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:16:20 compute-0 podman[201443]: 2026-02-19 19:16:20.270556789 +0000 UTC m=+0.112044453 container exec_died a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:16:20 compute-0 systemd[1]: libpod-conmon-a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9.scope: Deactivated successfully.
Feb 19 19:16:20 compute-0 sudo[201439]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:20 compute-0 sudo[201623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpiacinjyxvggawjhfjxckaxjivrpbwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528580.4486234-1512-8843528950013/AnsiballZ_file.py'
Feb 19 19:16:20 compute-0 sudo[201623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:20 compute-0 python3.9[201626]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:20 compute-0 sudo[201623]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:21 compute-0 sudo[201776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbxvhjrahcxlsrdrfhmdiytmeimaksie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528581.1195085-1521-37817196488478/AnsiballZ_podman_container_info.py'
Feb 19 19:16:21 compute-0 sudo[201776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:21 compute-0 python3.9[201779]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Feb 19 19:16:21 compute-0 sudo[201776]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:21 compute-0 sudo[201942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbgusmglwfbrzoafcyowaulckhhdfdhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528581.707511-1529-174007023641181/AnsiballZ_podman_container_exec.py'
Feb 19 19:16:21 compute-0 sudo[201942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:22 compute-0 python3.9[201945]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 19 19:16:22 compute-0 systemd[1]: Started libpod-conmon-7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397.scope.
Feb 19 19:16:22 compute-0 podman[201946]: 2026-02-19 19:16:22.149646193 +0000 UTC m=+0.066199973 container exec 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 19 19:16:22 compute-0 podman[201946]: 2026-02-19 19:16:22.182963388 +0000 UTC m=+0.099517128 container exec_died 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, release=1770267347, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-type=git, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 19 19:16:22 compute-0 systemd[1]: libpod-conmon-7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397.scope: Deactivated successfully.
Feb 19 19:16:22 compute-0 sudo[201942]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:22 compute-0 podman[201978]: 2026-02-19 19:16:22.293761531 +0000 UTC m=+0.058316286 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 19 19:16:22 compute-0 sudo[202149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omffbhexnfqrlzbjwutdzyoncojmmrzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528582.3747082-1537-84369493793405/AnsiballZ_podman_container_exec.py'
Feb 19 19:16:22 compute-0 sudo[202149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:22 compute-0 python3.9[202152]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 19 19:16:22 compute-0 systemd[1]: Started libpod-conmon-7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397.scope.
Feb 19 19:16:22 compute-0 podman[202153]: 2026-02-19 19:16:22.810183649 +0000 UTC m=+0.070244297 container exec 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, config_id=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Feb 19 19:16:22 compute-0 podman[202153]: 2026-02-19 19:16:22.839535201 +0000 UTC m=+0.099595839 container exec_died 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 19 19:16:22 compute-0 systemd[1]: libpod-conmon-7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397.scope: Deactivated successfully.
Feb 19 19:16:22 compute-0 sudo[202149]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:23 compute-0 sudo[202335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzbjtaohfpwwommjkyxsrhosyauwvmdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528583.0251863-1545-41059061648386/AnsiballZ_file.py'
Feb 19 19:16:23 compute-0 sudo[202335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:23 compute-0 python3.9[202338]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:23 compute-0 sudo[202335]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:16:32.107 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:16:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:16:32.107 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:16:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:16:32.107 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:16:37 compute-0 podman[202364]: 2026-02-19 19:16:37.286886873 +0000 UTC m=+0.057841495 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 19 19:16:37 compute-0 sudo[202508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdojipkgjcwvwgxohvyrkjgrpvmvqcyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528597.342873-1687-173760310953119/AnsiballZ_file.py'
Feb 19 19:16:37 compute-0 sudo[202508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:37 compute-0 python3.9[202511]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:37 compute-0 sudo[202508]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:38 compute-0 sudo[202661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkyheyndetcwsvnvzattwqhnvaxqiwko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528597.9882545-1703-54118142598902/AnsiballZ_stat.py'
Feb 19 19:16:38 compute-0 sudo[202661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:38 compute-0 python3.9[202664]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:16:38 compute-0 sudo[202661]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:38 compute-0 sudo[202785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjjxxfuhuoexfrnfdneaifbnfayrmsfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528597.9882545-1703-54118142598902/AnsiballZ_copy.py'
Feb 19 19:16:38 compute-0 sudo[202785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:38 compute-0 python3.9[202788]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528597.9882545-1703-54118142598902/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:38 compute-0 sudo[202785]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:39 compute-0 sudo[202948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdrapdhygxnwzfoqpvfxgcjmqjykvtwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528599.375425-1735-204796886669452/AnsiballZ_file.py'
Feb 19 19:16:39 compute-0 sudo[202948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:39 compute-0 podman[202912]: 2026-02-19 19:16:39.576965827 +0000 UTC m=+0.040825404 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1770267347, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that 
uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, config_id=openstack_network_exporter)
Feb 19 19:16:39 compute-0 python3.9[202958]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:39 compute-0 sudo[202948]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:40 compute-0 sudo[203112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skdaljsoqzelzmscjgiwzneglgthjzlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528600.065694-1751-208401650608655/AnsiballZ_stat.py'
Feb 19 19:16:40 compute-0 sudo[203112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:40 compute-0 python3.9[203115]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:16:40 compute-0 sudo[203112]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:40 compute-0 sudo[203191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xauaxquedrwiconqugeagndkfflaizbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528600.065694-1751-208401650608655/AnsiballZ_file.py'
Feb 19 19:16:40 compute-0 sudo[203191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:40 compute-0 python3.9[203194]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:40 compute-0 sudo[203191]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:41 compute-0 sudo[203344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcbhniwjsgjdosjazplicrxdzqojvzlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528601.1589005-1775-45739860847153/AnsiballZ_stat.py'
Feb 19 19:16:41 compute-0 sudo[203344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:41 compute-0 python3.9[203347]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:16:41 compute-0 sudo[203344]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:41 compute-0 sudo[203423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxgpuyjgupufrbbrfkefoiljrujmjgce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528601.1589005-1775-45739860847153/AnsiballZ_file.py'
Feb 19 19:16:41 compute-0 sudo[203423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:41 compute-0 python3.9[203426]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.atq0izp6 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:41 compute-0 sudo[203423]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:41 compute-0 podman[203427]: 2026-02-19 19:16:41.988610769 +0000 UTC m=+0.064991744 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260216, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Feb 19 19:16:42 compute-0 sudo[203603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vceoflpeizdhnzscnfixsjcmycqfynsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528602.27522-1799-1657144410405/AnsiballZ_stat.py'
Feb 19 19:16:42 compute-0 sudo[203603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:42 compute-0 python3.9[203606]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:16:42 compute-0 sudo[203603]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:42 compute-0 sudo[203682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mavgoiubkijoswkdwgvunicovwdgtgpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528602.27522-1799-1657144410405/AnsiballZ_file.py'
Feb 19 19:16:42 compute-0 sudo[203682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:43 compute-0 python3.9[203685]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:43 compute-0 sudo[203682]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:43 compute-0 sudo[203835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yejafbuvofxrznfdqwtrzcbpanpfcmto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528603.5287335-1825-147571162379229/AnsiballZ_command.py'
Feb 19 19:16:43 compute-0 sudo[203835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:43 compute-0 python3.9[203838]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:16:43 compute-0 sudo[203835]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:44 compute-0 sudo[203989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-busfrwyhkubjlcdmuymdpgaddlmkrkhi ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771528604.2635849-1841-260163315344725/AnsiballZ_edpm_nftables_from_files.py'
Feb 19 19:16:44 compute-0 sudo[203989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:44 compute-0 python3[203992]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 19 19:16:44 compute-0 sudo[203989]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:45 compute-0 sudo[204142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncsvezcidbxqdgpshiyxmcjbciatihui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528605.040543-1857-143226022978089/AnsiballZ_stat.py'
Feb 19 19:16:45 compute-0 sudo[204142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:45 compute-0 python3.9[204145]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:16:45 compute-0 sudo[204142]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:45 compute-0 sudo[204221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfcpguuzlhfexfnxdnnxtsdeqzoaojhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528605.040543-1857-143226022978089/AnsiballZ_file.py'
Feb 19 19:16:45 compute-0 sudo[204221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:45 compute-0 python3.9[204224]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:45 compute-0 sudo[204221]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:46 compute-0 sudo[204374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uznuaaqmubwhxyntkyraecsxwptqzypo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528606.2963507-1881-34165921156397/AnsiballZ_stat.py'
Feb 19 19:16:46 compute-0 sudo[204374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:46 compute-0 python3.9[204377]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:16:46 compute-0 sudo[204374]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:47 compute-0 sudo[204453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flycozctnncsbolyzfeotxzhziskqmcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528606.2963507-1881-34165921156397/AnsiballZ_file.py'
Feb 19 19:16:47 compute-0 sudo[204453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:47 compute-0 python3.9[204456]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:47 compute-0 sudo[204453]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:47 compute-0 sudo[204606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdgiwbiqztgnjwgnglitnnlsspozjxiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528607.4906104-1905-3761730187528/AnsiballZ_stat.py'
Feb 19 19:16:47 compute-0 sudo[204606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:47 compute-0 python3.9[204609]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:16:47 compute-0 sudo[204606]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:48 compute-0 sudo[204685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvvjuoxprcnhamnlzqojctjquvridtoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528607.4906104-1905-3761730187528/AnsiballZ_file.py'
Feb 19 19:16:48 compute-0 sudo[204685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:48 compute-0 python3.9[204688]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:48 compute-0 sudo[204685]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:48 compute-0 sudo[204838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unmyncvbxdalxvalkcshbjqxppplxrqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528608.5594966-1929-277588733464413/AnsiballZ_stat.py'
Feb 19 19:16:48 compute-0 sudo[204838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:48 compute-0 python3.9[204841]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:16:48 compute-0 sudo[204838]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:49 compute-0 sudo[204917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mykrdyrmfevesapndtjmxtbzavmqfwqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528608.5594966-1929-277588733464413/AnsiballZ_file.py'
Feb 19 19:16:49 compute-0 sudo[204917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:49 compute-0 python3.9[204920]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:49 compute-0 sudo[204917]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:49 compute-0 sudo[205070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzxvwrroytfdmtlunfqaqdzqtogoormw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528609.6691477-1953-147362406182858/AnsiballZ_stat.py'
Feb 19 19:16:49 compute-0 sudo[205070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:50 compute-0 python3.9[205073]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 19 19:16:50 compute-0 sudo[205070]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:50 compute-0 sudo[205196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsxonsfnxawpirbltxgtgdrrfldkcgnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528609.6691477-1953-147362406182858/AnsiballZ_copy.py'
Feb 19 19:16:50 compute-0 sudo[205196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:50 compute-0 python3.9[205199]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771528609.6691477-1953-147362406182858/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:50 compute-0 sudo[205196]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:51 compute-0 sudo[205349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbojmhkhoerzmcjvdqgjbtzjoysegjdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528610.882588-1983-22446424180735/AnsiballZ_file.py'
Feb 19 19:16:51 compute-0 sudo[205349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:51 compute-0 python3.9[205352]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:51 compute-0 sudo[205349]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:51 compute-0 sudo[205502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxsfxxxqblkhhwvlagkzlkvjablhycav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528611.4571679-1999-155912872667948/AnsiballZ_command.py'
Feb 19 19:16:51 compute-0 sudo[205502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:51 compute-0 python3.9[205505]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:16:51 compute-0 sudo[205502]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:52 compute-0 sudo[205675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnrfmpncfqqtjjwlrnlycuzujwwscnya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528612.1776564-2015-201480550179290/AnsiballZ_blockinfile.py'
Feb 19 19:16:52 compute-0 sudo[205675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:52 compute-0 podman[205632]: 2026-02-19 19:16:52.605721984 +0000 UTC m=+0.058843289 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 19 19:16:52 compute-0 python3.9[205686]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:52 compute-0 sudo[205675]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:53 compute-0 sudo[205836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnkxmuoktzzzkrxdzodmzgngdluitins ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528613.0972502-2033-145079413969496/AnsiballZ_command.py'
Feb 19 19:16:53 compute-0 sudo[205836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:53 compute-0 python3.9[205839]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:16:53 compute-0 sudo[205836]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:54 compute-0 sudo[205990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqtgzitbqndsjgchufemspfmrdyzivui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528613.7438471-2049-277314329259963/AnsiballZ_stat.py'
Feb 19 19:16:54 compute-0 sudo[205990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:54 compute-0 python3.9[205993]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 19 19:16:54 compute-0 sudo[205990]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:54 compute-0 sudo[206145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alvuiacojmwyjihaqwkjbhknvdblwndb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528614.4453118-2065-60409039352794/AnsiballZ_command.py'
Feb 19 19:16:54 compute-0 sudo[206145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:54 compute-0 python3.9[206148]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:16:54 compute-0 sudo[206145]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:55 compute-0 sudo[206301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asypxnfbascldifyrmnkeadefidzahnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771528615.1128094-2081-239821090519416/AnsiballZ_file.py'
Feb 19 19:16:55 compute-0 sudo[206301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:16:55 compute-0 python3.9[206304]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:16:55 compute-0 sudo[206301]: pam_unix(sudo:session): session closed for user root
Feb 19 19:16:55 compute-0 sshd-session[186971]: Connection closed by 192.168.122.30 port 34738
Feb 19 19:16:55 compute-0 sshd-session[186968]: pam_unix(sshd:session): session closed for user zuul
Feb 19 19:16:55 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Feb 19 19:16:55 compute-0 systemd[1]: session-26.scope: Consumed 1min 3.289s CPU time.
Feb 19 19:16:55 compute-0 systemd-logind[822]: Session 26 logged out. Waiting for processes to exit.
Feb 19 19:16:55 compute-0 systemd-logind[822]: Removed session 26.
Feb 19 19:16:59 compute-0 podman[196025]: time="2026-02-19T19:16:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:16:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:16:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:16:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:16:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2137 "" "Go-http-client/1.1"
Feb 19 19:17:00 compute-0 nova_compute[186662]: 2026-02-19 19:17:00.792 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:17:00 compute-0 nova_compute[186662]: 2026-02-19 19:17:00.793 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:17:01 compute-0 nova_compute[186662]: 2026-02-19 19:17:01.303 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:17:01 compute-0 nova_compute[186662]: 2026-02-19 19:17:01.303 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:17:01 compute-0 nova_compute[186662]: 2026-02-19 19:17:01.304 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:17:01 compute-0 nova_compute[186662]: 2026-02-19 19:17:01.304 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:17:01 compute-0 nova_compute[186662]: 2026-02-19 19:17:01.305 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:17:01 compute-0 nova_compute[186662]: 2026-02-19 19:17:01.305 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:17:01 compute-0 nova_compute[186662]: 2026-02-19 19:17:01.305 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:17:01 compute-0 nova_compute[186662]: 2026-02-19 19:17:01.306 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:17:01 compute-0 openstack_network_exporter[198916]: ERROR   19:17:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:17:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:17:01 compute-0 openstack_network_exporter[198916]: ERROR   19:17:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:17:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:17:01 compute-0 sshd-session[206338]: Accepted publickey for zuul from 38.102.83.2 port 38658 ssh2: RSA SHA256:IX/qU16dJ1hklhs5lf1q2BDqrTapkjt2lWmuMJZ/HJ0
Feb 19 19:17:01 compute-0 systemd-logind[822]: New session 27 of user zuul.
Feb 19 19:17:01 compute-0 systemd[1]: Started Session 27 of User zuul.
Feb 19 19:17:01 compute-0 sshd-session[206338]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 19:17:01 compute-0 sudo[206365]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvvyahuwwzxitgvnzuuwrbtmmhzjzwcc ; /usr/bin/python3'
Feb 19 19:17:01 compute-0 sudo[206365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:17:01 compute-0 nova_compute[186662]: 2026-02-19 19:17:01.818 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:17:01 compute-0 nova_compute[186662]: 2026-02-19 19:17:01.818 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:17:01 compute-0 nova_compute[186662]: 2026-02-19 19:17:01.818 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:17:01 compute-0 nova_compute[186662]: 2026-02-19 19:17:01.819 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:17:01 compute-0 nova_compute[186662]: 2026-02-19 19:17:01.963 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:17:01 compute-0 nova_compute[186662]: 2026-02-19 19:17:01.964 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:17:01 compute-0 python3[206367]: ansible-ansible.legacy.dnf Invoked with name=['nfs-utils', 'iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None
Feb 19 19:17:01 compute-0 nova_compute[186662]: 2026-02-19 19:17:01.975 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.011s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:17:01 compute-0 nova_compute[186662]: 2026-02-19 19:17:01.976 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6053MB free_disk=73.00956726074219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:17:01 compute-0 nova_compute[186662]: 2026-02-19 19:17:01.976 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:17:01 compute-0 nova_compute[186662]: 2026-02-19 19:17:01.976 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:17:03 compute-0 nova_compute[186662]: 2026-02-19 19:17:03.038 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:17:03 compute-0 nova_compute[186662]: 2026-02-19 19:17:03.038 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:17:01 up 48 min,  0 user,  load average: 0.37, 0.61, 0.53\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:17:03 compute-0 nova_compute[186662]: 2026-02-19 19:17:03.090 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:17:03 compute-0 sudo[206365]: pam_unix(sudo:session): session closed for user root
Feb 19 19:17:03 compute-0 sudo[206395]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpbkexzaalatsdjjixgtyefnozwytkvy ; /usr/bin/python3'
Feb 19 19:17:03 compute-0 sudo[206395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:17:03 compute-0 python3[206397]: ansible-community.general.ini_file Invoked with path=/etc/nfs.conf section=nfsd option=vers3 value=n backup=True mode=0644 state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:17:03 compute-0 nova_compute[186662]: 2026-02-19 19:17:03.598 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:17:03 compute-0 sudo[206395]: pam_unix(sudo:session): session closed for user root
Feb 19 19:17:03 compute-0 sudo[206423]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asbbnfloyzwshpwlmrgfqjsgiopjqddj ; /usr/bin/python3'
Feb 19 19:17:03 compute-0 sudo[206423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:17:04 compute-0 nova_compute[186662]: 2026-02-19 19:17:04.107 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:17:04 compute-0 nova_compute[186662]: 2026-02-19 19:17:04.107 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.131s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:17:04 compute-0 python3[206425]: ansible-ansible.builtin.systemd_service Invoked with name=rpc-statd.service masked=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None enabled=None force=None
Feb 19 19:17:04 compute-0 systemd[1]: Reloading.
Feb 19 19:17:04 compute-0 systemd-rc-local-generator[206446]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:17:04 compute-0 systemd-sysv-generator[206451]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:17:04 compute-0 sudo[206423]: pam_unix(sudo:session): session closed for user root
Feb 19 19:17:04 compute-0 sudo[206493]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uneksfnozyvpregkajufprokeqvsvbgp ; /usr/bin/python3'
Feb 19 19:17:04 compute-0 sudo[206493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:17:04 compute-0 python3[206495]: ansible-ansible.builtin.systemd_service Invoked with name=rpcbind.service masked=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None enabled=None force=None
Feb 19 19:17:04 compute-0 systemd[1]: Reloading.
Feb 19 19:17:04 compute-0 systemd-sysv-generator[206519]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:17:04 compute-0 systemd-rc-local-generator[206515]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:17:05 compute-0 systemd[1]: rpcbind.service: Current command vanished from the unit file, execution of the command list won't be resumed.
Feb 19 19:17:05 compute-0 sudo[206493]: pam_unix(sudo:session): session closed for user root
Feb 19 19:17:05 compute-0 sudo[206563]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsaqweykadujpanuaxhvbzavvdfovpgi ; /usr/bin/python3'
Feb 19 19:17:05 compute-0 sudo[206563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:17:05 compute-0 python3[206565]: ansible-ansible.builtin.systemd_service Invoked with name=rpcbind.socket masked=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None enabled=None force=None
Feb 19 19:17:05 compute-0 systemd[1]: Reloading.
Feb 19 19:17:05 compute-0 systemd-sysv-generator[206594]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:17:05 compute-0 systemd-rc-local-generator[206591]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:17:05 compute-0 systemd[1]: rpcbind.socket: Socket unit configuration has changed while unit has been running, no open socket file descriptor left. The socket unit is not functional until restarted.
Feb 19 19:17:05 compute-0 sudo[206563]: pam_unix(sudo:session): session closed for user root
Feb 19 19:17:05 compute-0 sudo[206633]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujzuygffmchjskgkpzlrfhqdqucehqwg ; /usr/bin/python3'
Feb 19 19:17:05 compute-0 sudo[206633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:17:05 compute-0 python3[206635]: ansible-ansible.builtin.file Invoked with path=/data/cinder_backend_1 state=directory mode=755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:17:05 compute-0 sudo[206633]: pam_unix(sudo:session): session closed for user root
Feb 19 19:17:06 compute-0 sudo[206659]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drgkbmjtjgyofeumtpdkwezvnkbmggmi ; /usr/bin/python3'
Feb 19 19:17:06 compute-0 sudo[206659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:17:06 compute-0 python3[206661]: ansible-ansible.builtin.file Invoked with path=/data/cinder_backend_2 state=directory mode=755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:17:06 compute-0 sudo[206659]: pam_unix(sudo:session): session closed for user root
Feb 19 19:17:06 compute-0 sudo[206685]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivjwbbgdcrxljgjvhoombwscqmydmhim ; /usr/bin/python3'
Feb 19 19:17:06 compute-0 sudo[206685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:17:06 compute-0 python3[206687]: ansible-ansible.builtin.file Invoked with path=/data/cinderbackup state=directory mode=755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:17:06 compute-0 sudo[206685]: pam_unix(sudo:session): session closed for user root
Feb 19 19:17:07 compute-0 sudo[206763]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xadrubtrbdkwznvzeproiczlvsahbaph ; /usr/bin/python3'
Feb 19 19:17:07 compute-0 sudo[206763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:17:08 compute-0 podman[206765]: 2026-02-19 19:17:08.009282543 +0000 UTC m=+0.047878400 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible)
Feb 19 19:17:08 compute-0 python3[206766]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/nfs-server.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 19 19:17:08 compute-0 sudo[206763]: pam_unix(sudo:session): session closed for user root
Feb 19 19:17:08 compute-0 sudo[206856]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxvwgrvxfpaiwrgqygjgbcplzoxqzmrw ; /usr/bin/python3'
Feb 19 19:17:08 compute-0 sudo[206856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:17:08 compute-0 python3[206858]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/nfs-server.nft mode=0666 src=/home/zuul/.ansible/tmp/ansible-tmp-1771528627.8375463-37411-55763565595165/source _original_basename=tmpm37em_16 follow=False checksum=f91e6a2e98f3d3c48705976f5b33f9e81e7cf7f4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:17:08 compute-0 sudo[206856]: pam_unix(sudo:session): session closed for user root
Feb 19 19:17:08 compute-0 sudo[206906]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbdwzamcyecvztglgqsaycopivfgxhts ; /usr/bin/python3'
Feb 19 19:17:08 compute-0 sudo[206906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:17:08 compute-0 python3[206908]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/sysconfig/nftables.conf line=include "/etc/nftables/nfs-server.nft" insertafter=EOF state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:17:08 compute-0 sudo[206906]: pam_unix(sudo:session): session closed for user root
Feb 19 19:17:09 compute-0 sudo[206932]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqpeaccehpwzmuzchvxirkrbjmcthyms ; /usr/bin/python3'
Feb 19 19:17:09 compute-0 sudo[206932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:17:09 compute-0 python3[206934]: ansible-ansible.builtin.systemd Invoked with name=nftables state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 19 19:17:09 compute-0 systemd[1]: Stopping Netfilter Tables...
Feb 19 19:17:09 compute-0 systemd[1]: nftables.service: Deactivated successfully.
Feb 19 19:17:09 compute-0 systemd[1]: Stopped Netfilter Tables.
Feb 19 19:17:09 compute-0 systemd[1]: Starting Netfilter Tables...
Feb 19 19:17:09 compute-0 systemd[1]: Finished Netfilter Tables.
Feb 19 19:17:09 compute-0 sudo[206932]: pam_unix(sudo:session): session closed for user root
Feb 19 19:17:09 compute-0 podman[206938]: 2026-02-19 19:17:09.7100107 +0000 UTC m=+0.060548328 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, version=9.7, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347)
Feb 19 19:17:09 compute-0 sudo[206983]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elhuyddorxbtrqtlbtwrzmuymlywdhba ; /usr/bin/python3'
Feb 19 19:17:09 compute-0 sudo[206983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:17:09 compute-0 python3[206985]: ansible-community.general.ini_file Invoked with path=/etc/nfs.conf section=nfsd option=host value=172.18.0.100 backup=True mode=0644 state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:17:09 compute-0 sudo[206983]: pam_unix(sudo:session): session closed for user root
Feb 19 19:17:10 compute-0 sudo[207011]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiknwyfslwobyhoswjzdpmvvyovrlcyn ; /usr/bin/python3'
Feb 19 19:17:10 compute-0 sudo[207011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:17:10 compute-0 python3[207013]: ansible-ansible.builtin.systemd Invoked with name=nfs-server state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 19 19:17:10 compute-0 systemd[1]: Reloading.
Feb 19 19:17:10 compute-0 systemd-sysv-generator[207047]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 19 19:17:10 compute-0 systemd-rc-local-generator[207044]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 19 19:17:10 compute-0 systemd[1]: rpcbind.socket: Socket unit configuration has changed while unit has been running, no open socket file descriptor left. The socket unit is not functional until restarted.
Feb 19 19:17:10 compute-0 systemd[1]: Mounting NFSD configuration filesystem...
Feb 19 19:17:10 compute-0 systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 19 19:17:10 compute-0 systemd[1]: Starting NFSv4 ID-name mapping service...
Feb 19 19:17:10 compute-0 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 19 19:17:10 compute-0 rpc.idmapd[207062]: Setting log level to 0
Feb 19 19:17:10 compute-0 systemd[1]: Started NFSv4 ID-name mapping service.
Feb 19 19:17:10 compute-0 systemd[1]: Mounted NFSD configuration filesystem.
Feb 19 19:17:10 compute-0 systemd[1]: Starting NFS Mount Daemon...
Feb 19 19:17:10 compute-0 systemd[1]: Starting NFSv4 Client Tracking Daemon...
Feb 19 19:17:10 compute-0 systemd[1]: Started NFSv4 Client Tracking Daemon.
Feb 19 19:17:10 compute-0 rpc.mountd[207069]: Version 2.5.4 starting
Feb 19 19:17:10 compute-0 systemd[1]: Started NFS Mount Daemon.
Feb 19 19:17:10 compute-0 systemd[1]: Starting NFS server and services...
Feb 19 19:17:10 compute-0 kernel: RPC: Registered rdma transport module.
Feb 19 19:17:10 compute-0 kernel: RPC: Registered rdma backchannel transport module.
Feb 19 19:17:10 compute-0 kernel: NFSD: Using nfsdcld client tracking operations.
Feb 19 19:17:10 compute-0 kernel: NFSD: no clients to reclaim, skipping NFSv4 grace period (net f0000000)
Feb 19 19:17:10 compute-0 systemd[1]: Reloading GSSAPI Proxy Daemon...
Feb 19 19:17:11 compute-0 systemd[1]: Reloaded GSSAPI Proxy Daemon.
Feb 19 19:17:11 compute-0 systemd[1]: Finished NFS server and services.
Feb 19 19:17:11 compute-0 sudo[207011]: pam_unix(sudo:session): session closed for user root
Feb 19 19:17:11 compute-0 sudo[207111]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbffwcvyzgbbwxgmlqiwjzgqofkatdvb ; /usr/bin/python3'
Feb 19 19:17:11 compute-0 sudo[207111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:17:11 compute-0 python3[207113]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/exports line=/data/cinder_backend_1 172.18.0.0/24(rw,sync,no_root_squash) state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:17:11 compute-0 sudo[207111]: pam_unix(sudo:session): session closed for user root
Feb 19 19:17:11 compute-0 sudo[207137]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pchbaursxfmnqveuvrbyvgowqzunexho ; /usr/bin/python3'
Feb 19 19:17:11 compute-0 sudo[207137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:17:11 compute-0 python3[207139]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/exports line=/data/cinder_backend_2 172.18.0.0/24(rw,sync,no_root_squash) state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:17:11 compute-0 sudo[207137]: pam_unix(sudo:session): session closed for user root
Feb 19 19:17:11 compute-0 sudo[207163]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-licyzcctbjexyqwejgkgxgmhobnvrder ; /usr/bin/python3'
Feb 19 19:17:11 compute-0 sudo[207163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:17:11 compute-0 python3[207165]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/exports line=/data/cinderbackup 172.18.0.0/24(rw,sync,no_root_squash) state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 19 19:17:11 compute-0 sudo[207163]: pam_unix(sudo:session): session closed for user root
Feb 19 19:17:11 compute-0 sudo[207189]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptcalzbulglyhqfaigfamudgqzakobjp ; /usr/bin/python3'
Feb 19 19:17:11 compute-0 sudo[207189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:17:11 compute-0 python3[207191]: ansible-ansible.legacy.command Invoked with _raw_params=exportfs -a _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 19 19:17:11 compute-0 sudo[207189]: pam_unix(sudo:session): session closed for user root
Feb 19 19:17:12 compute-0 podman[207193]: 2026-02-19 19:17:12.309996084 +0000 UTC m=+0.075798998 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 19 19:17:23 compute-0 podman[207219]: 2026-02-19 19:17:23.259399122 +0000 UTC m=+0.040543939 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 19 19:17:26 compute-0 sshd-session[206371]: ssh_dispatch_run_fatal: Connection from 213.177.179.91 port 48904: Connection timed out [preauth]
Feb 19 19:17:29 compute-0 podman[196025]: time="2026-02-19T19:17:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:17:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:17:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:17:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:17:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2144 "" "Go-http-client/1.1"
Feb 19 19:17:31 compute-0 openstack_network_exporter[198916]: ERROR   19:17:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:17:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:17:31 compute-0 openstack_network_exporter[198916]: ERROR   19:17:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:17:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:17:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:17:32.108 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:17:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:17:32.108 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:17:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:17:32.108 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:17:36 compute-0 rpc.mountd[207069]: v4.1 client attached: 0xea403b29699761b6 from "172.18.0.36:789"
Feb 19 19:17:38 compute-0 podman[207253]: 2026-02-19 19:17:38.271765348 +0000 UTC m=+0.049664925 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 19 19:17:40 compute-0 podman[207273]: 2026-02-19 19:17:40.286315121 +0000 UTC m=+0.052117424 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=openstack_network_exporter, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, maintainer=Red Hat, Inc.)
Feb 19 19:17:43 compute-0 podman[207294]: 2026-02-19 19:17:43.315727214 +0000 UTC m=+0.092000465 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 19 19:17:54 compute-0 podman[207319]: 2026-02-19 19:17:54.283522098 +0000 UTC m=+0.050380923 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:17:59 compute-0 podman[196025]: time="2026-02-19T19:17:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:17:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:17:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:17:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:17:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2147 "" "Go-http-client/1.1"
Feb 19 19:18:00 compute-0 sshd-session[207345]: Invalid user oracle from 138.255.157.62 port 2702
Feb 19 19:18:01 compute-0 sshd-session[207345]: Received disconnect from 138.255.157.62 port 2702:11: Bye Bye [preauth]
Feb 19 19:18:01 compute-0 sshd-session[207345]: Disconnected from invalid user oracle 138.255.157.62 port 2702 [preauth]
Feb 19 19:18:01 compute-0 openstack_network_exporter[198916]: ERROR   19:18:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:18:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:18:01 compute-0 openstack_network_exporter[198916]: ERROR   19:18:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:18:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:18:04 compute-0 nova_compute[186662]: 2026-02-19 19:18:04.109 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:18:04 compute-0 nova_compute[186662]: 2026-02-19 19:18:04.109 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:18:04 compute-0 nova_compute[186662]: 2026-02-19 19:18:04.109 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:18:04 compute-0 nova_compute[186662]: 2026-02-19 19:18:04.110 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:18:04 compute-0 nova_compute[186662]: 2026-02-19 19:18:04.110 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:18:04 compute-0 nova_compute[186662]: 2026-02-19 19:18:04.110 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:18:04 compute-0 nova_compute[186662]: 2026-02-19 19:18:04.110 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:18:04 compute-0 nova_compute[186662]: 2026-02-19 19:18:04.110 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:18:04 compute-0 nova_compute[186662]: 2026-02-19 19:18:04.111 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:18:04 compute-0 nova_compute[186662]: 2026-02-19 19:18:04.622 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:18:04 compute-0 nova_compute[186662]: 2026-02-19 19:18:04.623 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:18:04 compute-0 nova_compute[186662]: 2026-02-19 19:18:04.624 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:18:04 compute-0 nova_compute[186662]: 2026-02-19 19:18:04.624 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:18:04 compute-0 nova_compute[186662]: 2026-02-19 19:18:04.731 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:18:04 compute-0 nova_compute[186662]: 2026-02-19 19:18:04.733 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:18:04 compute-0 nova_compute[186662]: 2026-02-19 19:18:04.743 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.010s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:18:04 compute-0 nova_compute[186662]: 2026-02-19 19:18:04.744 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6134MB free_disk=73.0095329284668GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:18:04 compute-0 nova_compute[186662]: 2026-02-19 19:18:04.744 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:18:04 compute-0 nova_compute[186662]: 2026-02-19 19:18:04.744 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:18:05 compute-0 nova_compute[186662]: 2026-02-19 19:18:05.782 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:18:05 compute-0 nova_compute[186662]: 2026-02-19 19:18:05.782 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:18:04 up 49 min,  0 user,  load average: 0.21, 0.52, 0.51\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:18:05 compute-0 nova_compute[186662]: 2026-02-19 19:18:05.804 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:18:06 compute-0 nova_compute[186662]: 2026-02-19 19:18:06.310 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:18:06 compute-0 nova_compute[186662]: 2026-02-19 19:18:06.818 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:18:06 compute-0 nova_compute[186662]: 2026-02-19 19:18:06.819 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.075s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:18:09 compute-0 podman[207348]: 2026-02-19 19:18:09.276883888 +0000 UTC m=+0.050853073 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:18:11 compute-0 podman[207368]: 2026-02-19 19:18:11.278106643 +0000 UTC m=+0.053822405 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9/ubi-minimal, release=1770267347, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 19 19:18:14 compute-0 podman[207389]: 2026-02-19 19:18:14.281228159 +0000 UTC m=+0.057710207 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Feb 19 19:18:21 compute-0 sshd-session[207416]: Invalid user claude from 182.75.216.74 port 19433
Feb 19 19:18:21 compute-0 sshd-session[207416]: Received disconnect from 182.75.216.74 port 19433:11: Bye Bye [preauth]
Feb 19 19:18:21 compute-0 sshd-session[207416]: Disconnected from invalid user claude 182.75.216.74 port 19433 [preauth]
Feb 19 19:18:25 compute-0 podman[207418]: 2026-02-19 19:18:25.261564394 +0000 UTC m=+0.043269493 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 19 19:18:29 compute-0 podman[196025]: time="2026-02-19T19:18:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:18:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:18:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:18:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:18:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2146 "" "Go-http-client/1.1"
Feb 19 19:18:31 compute-0 openstack_network_exporter[198916]: ERROR   19:18:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:18:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:18:31 compute-0 openstack_network_exporter[198916]: ERROR   19:18:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:18:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:18:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:18:32.109 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:18:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:18:32.109 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:18:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:18:32.109 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:18:40 compute-0 podman[207443]: 2026-02-19 19:18:40.262104786 +0000 UTC m=+0.043914717 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:18:42 compute-0 podman[207462]: 2026-02-19 19:18:42.29088095 +0000 UTC m=+0.062800868 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 19 19:18:45 compute-0 podman[207483]: 2026-02-19 19:18:45.316810042 +0000 UTC m=+0.092064584 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Feb 19 19:18:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:18:55.060 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:18:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:18:55.060 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:18:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:18:55.062 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:18:56 compute-0 podman[207511]: 2026-02-19 19:18:56.276866586 +0000 UTC m=+0.053638395 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 19 19:18:59 compute-0 podman[196025]: time="2026-02-19T19:18:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:18:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:18:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:18:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:18:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2150 "" "Go-http-client/1.1"
Feb 19 19:19:00 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:19:00.199 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:45:ca 192.168.122.171'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.171/24', 'neutron:device_id': 'ovnmeta-7f3dd1ac-c7ae-4174-af19-9ff6288733d1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f3dd1ac-c7ae-4174-af19-9ff6288733d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '084bf37190834c4d9a8f0459d9d05ec7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb17300e-c4e9-4360-9e93-e14f5aa72f00, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=24556d05-3912-47f4-afb7-5e27caf8d629) old=Port_Binding(mac=['fa:16:3e:f8:45:ca'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-7f3dd1ac-c7ae-4174-af19-9ff6288733d1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f3dd1ac-c7ae-4174-af19-9ff6288733d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '084bf37190834c4d9a8f0459d9d05ec7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:19:00 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:19:00.201 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 24556d05-3912-47f4-afb7-5e27caf8d629 in datapath 7f3dd1ac-c7ae-4174-af19-9ff6288733d1 updated
Feb 19 19:19:00 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:19:00.203 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7f3dd1ac-c7ae-4174-af19-9ff6288733d1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:19:00 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:19:00.204 105986 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp9r3vjl_4/privsep.sock']
Feb 19 19:19:00 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:19:00.863 105986 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 19 19:19:00 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:19:00.864 105986 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp9r3vjl_4/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Feb 19 19:19:00 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:19:00.737 207540 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 19 19:19:00 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:19:00.742 207540 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 19 19:19:00 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:19:00.745 207540 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Feb 19 19:19:00 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:19:00.745 207540 INFO oslo.privsep.daemon [-] privsep daemon running as pid 207540
Feb 19 19:19:00 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:19:00.866 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[8b91258a-cacd-4a8d-8296-30795ced86f7]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:19:01 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:19:01.309 207540 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:19:01 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:19:01.309 207540 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:19:01 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:19:01.310 207540 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:19:01 compute-0 openstack_network_exporter[198916]: ERROR   19:19:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:19:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:19:01 compute-0 openstack_network_exporter[198916]: ERROR   19:19:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:19:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:19:01 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:19:01.728 207540 INFO oslo_service.backend [-] Loading backend: eventlet
Feb 19 19:19:01 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:19:01.732 207540 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Feb 19 19:19:01 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:19:01.762 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[3cf598bc-eafc-4123-9d2a-b3fbb7138c83]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:19:05 compute-0 nova_compute[186662]: 2026-02-19 19:19:05.280 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:19:05 compute-0 nova_compute[186662]: 2026-02-19 19:19:05.280 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:19:05 compute-0 nova_compute[186662]: 2026-02-19 19:19:05.791 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:19:05 compute-0 nova_compute[186662]: 2026-02-19 19:19:05.791 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:19:05 compute-0 nova_compute[186662]: 2026-02-19 19:19:05.791 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:19:05 compute-0 nova_compute[186662]: 2026-02-19 19:19:05.791 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:19:05 compute-0 nova_compute[186662]: 2026-02-19 19:19:05.791 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:19:05 compute-0 nova_compute[186662]: 2026-02-19 19:19:05.791 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:19:05 compute-0 nova_compute[186662]: 2026-02-19 19:19:05.791 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:19:05 compute-0 nova_compute[186662]: 2026-02-19 19:19:05.792 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:19:06 compute-0 nova_compute[186662]: 2026-02-19 19:19:06.302 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:19:06 compute-0 nova_compute[186662]: 2026-02-19 19:19:06.302 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:19:06 compute-0 nova_compute[186662]: 2026-02-19 19:19:06.303 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:19:06 compute-0 nova_compute[186662]: 2026-02-19 19:19:06.303 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:19:06 compute-0 nova_compute[186662]: 2026-02-19 19:19:06.414 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:19:06 compute-0 nova_compute[186662]: 2026-02-19 19:19:06.415 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:19:06 compute-0 nova_compute[186662]: 2026-02-19 19:19:06.441 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:19:06 compute-0 nova_compute[186662]: 2026-02-19 19:19:06.441 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6054MB free_disk=73.01530838012695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:19:06 compute-0 nova_compute[186662]: 2026-02-19 19:19:06.442 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:19:06 compute-0 nova_compute[186662]: 2026-02-19 19:19:06.442 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:19:07 compute-0 nova_compute[186662]: 2026-02-19 19:19:07.479 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:19:07 compute-0 nova_compute[186662]: 2026-02-19 19:19:07.479 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:19:06 up 50 min,  0 user,  load average: 0.14, 0.43, 0.48\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:19:07 compute-0 nova_compute[186662]: 2026-02-19 19:19:07.552 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:19:08 compute-0 nova_compute[186662]: 2026-02-19 19:19:08.060 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:19:08 compute-0 nova_compute[186662]: 2026-02-19 19:19:08.567 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:19:08 compute-0 nova_compute[186662]: 2026-02-19 19:19:08.568 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.126s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:19:08 compute-0 sshd-session[207546]: Invalid user n8n from 27.50.25.190 port 59808
Feb 19 19:19:09 compute-0 sshd-session[207546]: Received disconnect from 27.50.25.190 port 59808:11: Bye Bye [preauth]
Feb 19 19:19:09 compute-0 sshd-session[207546]: Disconnected from invalid user n8n 27.50.25.190 port 59808 [preauth]
Feb 19 19:19:11 compute-0 podman[207548]: 2026-02-19 19:19:11.257601787 +0000 UTC m=+0.038859740 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Feb 19 19:19:13 compute-0 podman[207567]: 2026-02-19 19:19:13.25941314 +0000 UTC m=+0.041648315 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=openstack_network_exporter, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z)
Feb 19 19:19:16 compute-0 podman[207588]: 2026-02-19 19:19:16.308918213 +0000 UTC m=+0.083955734 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Feb 19 19:19:24 compute-0 sshd-session[207615]: Received disconnect from 197.211.55.20 port 33144:11: Bye Bye [preauth]
Feb 19 19:19:24 compute-0 sshd-session[207615]: Disconnected from authenticating user root 197.211.55.20 port 33144 [preauth]
Feb 19 19:19:27 compute-0 podman[207617]: 2026-02-19 19:19:27.286405924 +0000 UTC m=+0.061935439 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:19:29 compute-0 podman[196025]: time="2026-02-19T19:19:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:19:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:19:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:19:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:19:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2148 "" "Go-http-client/1.1"
Feb 19 19:19:31 compute-0 openstack_network_exporter[198916]: ERROR   19:19:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:19:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:19:31 compute-0 openstack_network_exporter[198916]: ERROR   19:19:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:19:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:19:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:19:32.110 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:19:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:19:32.111 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:19:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:19:32.111 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:19:42 compute-0 podman[207642]: 2026-02-19 19:19:42.280508291 +0000 UTC m=+0.055689233 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:19:42 compute-0 rsyslogd[1019]: imjournal: 962 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Feb 19 19:19:44 compute-0 podman[207661]: 2026-02-19 19:19:44.267373062 +0000 UTC m=+0.048535897 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, release=1770267347, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.7, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter)
Feb 19 19:19:47 compute-0 sshd-session[207682]: Received disconnect from 195.178.110.15 port 29954:11:  [preauth]
Feb 19 19:19:47 compute-0 sshd-session[207682]: Disconnected from authenticating user root 195.178.110.15 port 29954 [preauth]
Feb 19 19:19:47 compute-0 podman[207684]: 2026-02-19 19:19:47.312355002 +0000 UTC m=+0.089576917 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible)
Feb 19 19:19:52 compute-0 sshd-session[207711]: Received disconnect from 189.165.79.177 port 50768:11: Bye Bye [preauth]
Feb 19 19:19:52 compute-0 sshd-session[207711]: Disconnected from authenticating user root 189.165.79.177 port 50768 [preauth]
Feb 19 19:19:57 compute-0 nova_compute[186662]: 2026-02-19 19:19:57.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:19:57 compute-0 nova_compute[186662]: 2026-02-19 19:19:57.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Feb 19 19:19:57 compute-0 podman[207713]: 2026-02-19 19:19:57.644276899 +0000 UTC m=+0.041996216 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 19 19:19:58 compute-0 nova_compute[186662]: 2026-02-19 19:19:58.082 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Feb 19 19:19:58 compute-0 nova_compute[186662]: 2026-02-19 19:19:58.082 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:19:58 compute-0 nova_compute[186662]: 2026-02-19 19:19:58.082 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Feb 19 19:19:58 compute-0 nova_compute[186662]: 2026-02-19 19:19:58.589 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:19:59 compute-0 podman[196025]: time="2026-02-19T19:19:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:19:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:19:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:19:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:19:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2149 "" "Go-http-client/1.1"
Feb 19 19:20:01 compute-0 nova_compute[186662]: 2026-02-19 19:20:01.095 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:20:01 compute-0 nova_compute[186662]: 2026-02-19 19:20:01.095 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:20:01 compute-0 openstack_network_exporter[198916]: ERROR   19:20:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:20:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:20:01 compute-0 openstack_network_exporter[198916]: ERROR   19:20:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:20:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:20:01 compute-0 nova_compute[186662]: 2026-02-19 19:20:01.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:20:01 compute-0 nova_compute[186662]: 2026-02-19 19:20:01.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:20:02 compute-0 nova_compute[186662]: 2026-02-19 19:20:02.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:20:02 compute-0 nova_compute[186662]: 2026-02-19 19:20:02.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:20:03 compute-0 nova_compute[186662]: 2026-02-19 19:20:03.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:20:03 compute-0 nova_compute[186662]: 2026-02-19 19:20:03.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:20:03 compute-0 nova_compute[186662]: 2026-02-19 19:20:03.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:20:04 compute-0 nova_compute[186662]: 2026-02-19 19:20:04.088 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:20:04 compute-0 nova_compute[186662]: 2026-02-19 19:20:04.088 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:20:04 compute-0 nova_compute[186662]: 2026-02-19 19:20:04.089 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:20:04 compute-0 nova_compute[186662]: 2026-02-19 19:20:04.089 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:20:04 compute-0 nova_compute[186662]: 2026-02-19 19:20:04.207 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:20:04 compute-0 nova_compute[186662]: 2026-02-19 19:20:04.208 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:20:04 compute-0 nova_compute[186662]: 2026-02-19 19:20:04.217 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.009s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:20:04 compute-0 nova_compute[186662]: 2026-02-19 19:20:04.217 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6073MB free_disk=73.01117324829102GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:20:04 compute-0 nova_compute[186662]: 2026-02-19 19:20:04.217 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:20:04 compute-0 nova_compute[186662]: 2026-02-19 19:20:04.218 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:20:05 compute-0 nova_compute[186662]: 2026-02-19 19:20:05.262 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:20:05 compute-0 nova_compute[186662]: 2026-02-19 19:20:05.262 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:20:04 up 51 min,  0 user,  load average: 0.09, 0.37, 0.45\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:20:05 compute-0 nova_compute[186662]: 2026-02-19 19:20:05.276 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:20:05 compute-0 nova_compute[186662]: 2026-02-19 19:20:05.783 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:20:06 compute-0 nova_compute[186662]: 2026-02-19 19:20:06.291 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:20:06 compute-0 nova_compute[186662]: 2026-02-19 19:20:06.291 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.074s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:20:13 compute-0 podman[207737]: 2026-02-19 19:20:13.296765607 +0000 UTC m=+0.074188398 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:20:15 compute-0 podman[207757]: 2026-02-19 19:20:15.262814526 +0000 UTC m=+0.042588810 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, version=9.7, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z)
Feb 19 19:20:18 compute-0 podman[207779]: 2026-02-19 19:20:18.31938963 +0000 UTC m=+0.091941114 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, 
tcib_build_tag=watcher_latest, config_id=ovn_controller)
Feb 19 19:20:28 compute-0 podman[207806]: 2026-02-19 19:20:28.272389108 +0000 UTC m=+0.050013253 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:20:29 compute-0 podman[196025]: time="2026-02-19T19:20:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:20:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:20:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:20:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:20:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2147 "" "Go-http-client/1.1"
Feb 19 19:20:31 compute-0 openstack_network_exporter[198916]: ERROR   19:20:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:20:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:20:31 compute-0 openstack_network_exporter[198916]: ERROR   19:20:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:20:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:20:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:20:32.112 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:20:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:20:32.112 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:20:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:20:32.112 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:20:44 compute-0 podman[207834]: 2026-02-19 19:20:44.295969833 +0000 UTC m=+0.069183342 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 19 19:20:46 compute-0 podman[207854]: 2026-02-19 19:20:46.260238285 +0000 UTC m=+0.042453000 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.7, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, build-date=2026-02-05T04:57:10Z)
Feb 19 19:20:49 compute-0 podman[207875]: 2026-02-19 19:20:49.278570669 +0000 UTC m=+0.059540150 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:20:59 compute-0 podman[207902]: 2026-02-19 19:20:59.264447176 +0000 UTC m=+0.046410705 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:20:59 compute-0 podman[196025]: time="2026-02-19T19:20:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:20:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:20:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:20:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:20:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2150 "" "Go-http-client/1.1"
Feb 19 19:21:01 compute-0 openstack_network_exporter[198916]: ERROR   19:21:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:21:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:21:01 compute-0 openstack_network_exporter[198916]: ERROR   19:21:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:21:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:21:04 compute-0 nova_compute[186662]: 2026-02-19 19:21:04.291 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:21:04 compute-0 nova_compute[186662]: 2026-02-19 19:21:04.882 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:21:04 compute-0 nova_compute[186662]: 2026-02-19 19:21:04.882 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:21:04 compute-0 nova_compute[186662]: 2026-02-19 19:21:04.882 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:21:04 compute-0 nova_compute[186662]: 2026-02-19 19:21:04.883 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:21:04 compute-0 nova_compute[186662]: 2026-02-19 19:21:04.883 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:21:04 compute-0 nova_compute[186662]: 2026-02-19 19:21:04.883 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:21:05 compute-0 nova_compute[186662]: 2026-02-19 19:21:05.162 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:21:05 compute-0 nova_compute[186662]: 2026-02-19 19:21:05.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:21:05 compute-0 nova_compute[186662]: 2026-02-19 19:21:05.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:21:06 compute-0 nova_compute[186662]: 2026-02-19 19:21:06.088 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:21:06 compute-0 nova_compute[186662]: 2026-02-19 19:21:06.088 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:21:06 compute-0 nova_compute[186662]: 2026-02-19 19:21:06.088 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:21:06 compute-0 nova_compute[186662]: 2026-02-19 19:21:06.089 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:21:06 compute-0 nova_compute[186662]: 2026-02-19 19:21:06.221 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:21:06 compute-0 nova_compute[186662]: 2026-02-19 19:21:06.222 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:21:06 compute-0 nova_compute[186662]: 2026-02-19 19:21:06.234 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.013s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:21:06 compute-0 nova_compute[186662]: 2026-02-19 19:21:06.235 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6065MB free_disk=73.01122665405273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:21:06 compute-0 nova_compute[186662]: 2026-02-19 19:21:06.235 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:21:06 compute-0 nova_compute[186662]: 2026-02-19 19:21:06.236 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:21:07 compute-0 nova_compute[186662]: 2026-02-19 19:21:07.320 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:21:07 compute-0 nova_compute[186662]: 2026-02-19 19:21:07.320 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:21:06 up 52 min,  0 user,  load average: 0.03, 0.30, 0.42\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:21:07 compute-0 nova_compute[186662]: 2026-02-19 19:21:07.362 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing inventories for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Feb 19 19:21:07 compute-0 nova_compute[186662]: 2026-02-19 19:21:07.399 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating ProviderTree inventory for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Feb 19 19:21:07 compute-0 nova_compute[186662]: 2026-02-19 19:21:07.399 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating inventory in ProviderTree for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Feb 19 19:21:07 compute-0 nova_compute[186662]: 2026-02-19 19:21:07.414 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing aggregate associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Feb 19 19:21:07 compute-0 nova_compute[186662]: 2026-02-19 19:21:07.432 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing trait associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_SSE2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,HW_ARCH_X86_64,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_USB _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Feb 19 19:21:07 compute-0 nova_compute[186662]: 2026-02-19 19:21:07.497 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:21:08 compute-0 nova_compute[186662]: 2026-02-19 19:21:08.004 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:21:08 compute-0 nova_compute[186662]: 2026-02-19 19:21:08.518 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:21:08 compute-0 nova_compute[186662]: 2026-02-19 19:21:08.519 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.283s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:21:15 compute-0 podman[207927]: 2026-02-19 19:21:15.257456815 +0000 UTC m=+0.039061129 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 19 19:21:17 compute-0 podman[207947]: 2026-02-19 19:21:17.272595809 +0000 UTC m=+0.053396033 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1770267347, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, vendor=Red Hat, Inc., version=9.7)
Feb 19 19:21:20 compute-0 podman[207968]: 2026-02-19 19:21:20.278616136 +0000 UTC m=+0.060072872 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 19 19:21:29 compute-0 podman[196025]: time="2026-02-19T19:21:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:21:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:21:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:21:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:21:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2148 "" "Go-http-client/1.1"
Feb 19 19:21:29 compute-0 podman[207995]: 2026-02-19 19:21:29.809302269 +0000 UTC m=+0.041070797 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:21:31 compute-0 openstack_network_exporter[198916]: ERROR   19:21:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:21:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:21:31 compute-0 openstack_network_exporter[198916]: ERROR   19:21:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:21:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:21:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:21:32.113 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:21:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:21:32.113 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:21:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:21:32.113 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:21:35 compute-0 sshd-session[208020]: Invalid user cma from 182.75.216.74 port 29509
Feb 19 19:21:35 compute-0 sshd-session[208020]: Received disconnect from 182.75.216.74 port 29509:11: Bye Bye [preauth]
Feb 19 19:21:35 compute-0 sshd-session[208020]: Disconnected from invalid user cma 182.75.216.74 port 29509 [preauth]
Feb 19 19:21:45 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:21:45.880 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:21:45 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:21:45.881 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:21:46 compute-0 podman[208023]: 2026-02-19 19:21:46.28739943 +0000 UTC m=+0.059879218 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 19 19:21:48 compute-0 podman[208045]: 2026-02-19 19:21:48.272743833 +0000 UTC m=+0.050687798 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347)
Feb 19 19:21:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:21:48.297 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:c8:f8 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e6ae37e9-b920-4375-a4a8-c6d82ba43175', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e6ae37e9-b920-4375-a4a8-c6d82ba43175', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c26564610734c7780dc5f7e669d7347', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cdcb6030-846a-44c2-9d48-9fccef703f9e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4c6d4d92-979a-417f-b204-ccd4f1632bfd) old=Port_Binding(mac=['fa:16:3e:b0:c8:f8'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-e6ae37e9-b920-4375-a4a8-c6d82ba43175', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e6ae37e9-b920-4375-a4a8-c6d82ba43175', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c26564610734c7780dc5f7e669d7347', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:21:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:21:48.298 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4c6d4d92-979a-417f-b204-ccd4f1632bfd in datapath e6ae37e9-b920-4375-a4a8-c6d82ba43175 updated
Feb 19 19:21:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:21:48.299 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e6ae37e9-b920-4375-a4a8-c6d82ba43175, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:21:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:21:48.300 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[b10233d3-0bf2-43ab-9f29-49ca59257c0b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:21:51 compute-0 podman[208066]: 2026-02-19 19:21:51.335417564 +0000 UTC m=+0.115486444 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216)
Feb 19 19:21:51 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:21:51.882 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:21:59 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:21:59.254 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:07:85 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-517bb5ab-a2a0-4482-9899-4dff8db55c4b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-517bb5ab-a2a0-4482-9899-4dff8db55c4b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a3222ad9cd94764a64cfc7797f8ff3f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e302ba89-e2c1-490c-9af2-40da0eea471b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fe0ffb4f-5cf6-4d22-a090-5a28514d5e4d) old=Port_Binding(mac=['fa:16:3e:94:07:85'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-517bb5ab-a2a0-4482-9899-4dff8db55c4b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-517bb5ab-a2a0-4482-9899-4dff8db55c4b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a3222ad9cd94764a64cfc7797f8ff3f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:21:59 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:21:59.255 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fe0ffb4f-5cf6-4d22-a090-5a28514d5e4d in datapath 517bb5ab-a2a0-4482-9899-4dff8db55c4b updated
Feb 19 19:21:59 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:21:59.255 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 517bb5ab-a2a0-4482-9899-4dff8db55c4b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:21:59 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:21:59.256 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[870f8f03-e6bc-4286-8741-d3ecf7a16db0]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:21:59 compute-0 podman[196025]: time="2026-02-19T19:21:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:21:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:21:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:21:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:21:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2151 "" "Go-http-client/1.1"
Feb 19 19:22:00 compute-0 podman[208093]: 2026-02-19 19:22:00.264505051 +0000 UTC m=+0.041363435 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 19 19:22:01 compute-0 openstack_network_exporter[198916]: ERROR   19:22:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:22:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:22:01 compute-0 openstack_network_exporter[198916]: ERROR   19:22:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:22:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:22:04 compute-0 nova_compute[186662]: 2026-02-19 19:22:04.519 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:22:04 compute-0 nova_compute[186662]: 2026-02-19 19:22:04.519 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:22:04 compute-0 nova_compute[186662]: 2026-02-19 19:22:04.519 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:22:04 compute-0 nova_compute[186662]: 2026-02-19 19:22:04.519 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:22:05 compute-0 nova_compute[186662]: 2026-02-19 19:22:05.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:22:05 compute-0 nova_compute[186662]: 2026-02-19 19:22:05.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:22:06 compute-0 nova_compute[186662]: 2026-02-19 19:22:06.572 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:22:06 compute-0 nova_compute[186662]: 2026-02-19 19:22:06.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:22:06 compute-0 nova_compute[186662]: 2026-02-19 19:22:06.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:22:07 compute-0 nova_compute[186662]: 2026-02-19 19:22:07.085 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:22:07 compute-0 nova_compute[186662]: 2026-02-19 19:22:07.086 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:22:07 compute-0 nova_compute[186662]: 2026-02-19 19:22:07.086 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:22:07 compute-0 nova_compute[186662]: 2026-02-19 19:22:07.086 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:22:07 compute-0 nova_compute[186662]: 2026-02-19 19:22:07.232 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:22:07 compute-0 nova_compute[186662]: 2026-02-19 19:22:07.233 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:22:07 compute-0 nova_compute[186662]: 2026-02-19 19:22:07.241 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.009s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:22:07 compute-0 nova_compute[186662]: 2026-02-19 19:22:07.242 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6071MB free_disk=73.01129150390625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:22:07 compute-0 nova_compute[186662]: 2026-02-19 19:22:07.242 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:22:07 compute-0 nova_compute[186662]: 2026-02-19 19:22:07.243 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:22:08 compute-0 nova_compute[186662]: 2026-02-19 19:22:08.281 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:22:08 compute-0 nova_compute[186662]: 2026-02-19 19:22:08.282 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:22:07 up 53 min,  0 user,  load average: 0.01, 0.24, 0.39\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:22:08 compute-0 nova_compute[186662]: 2026-02-19 19:22:08.297 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:22:08 compute-0 nova_compute[186662]: 2026-02-19 19:22:08.805 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:22:09 compute-0 nova_compute[186662]: 2026-02-19 19:22:09.314 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:22:09 compute-0 nova_compute[186662]: 2026-02-19 19:22:09.315 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.073s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:22:11 compute-0 sshd-session[206341]: Received disconnect from 38.102.83.2 port 38658:11: disconnected by user
Feb 19 19:22:11 compute-0 sshd-session[206341]: Disconnected from user zuul 38.102.83.2 port 38658
Feb 19 19:22:11 compute-0 sshd-session[206338]: pam_unix(sshd:session): session closed for user zuul
Feb 19 19:22:11 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Feb 19 19:22:11 compute-0 systemd[1]: session-27.scope: Consumed 5.902s CPU time.
Feb 19 19:22:11 compute-0 systemd-logind[822]: Session 27 logged out. Waiting for processes to exit.
Feb 19 19:22:11 compute-0 systemd-logind[822]: Removed session 27.
Feb 19 19:22:17 compute-0 podman[208118]: 2026-02-19 19:22:17.257366086 +0000 UTC m=+0.038074815 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Feb 19 19:22:19 compute-0 podman[208137]: 2026-02-19 19:22:19.269055073 +0000 UTC m=+0.049297775 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 19 19:22:22 compute-0 podman[208158]: 2026-02-19 19:22:22.301537219 +0000 UTC m=+0.075541615 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 19 19:22:29 compute-0 podman[196025]: time="2026-02-19T19:22:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:22:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:22:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:22:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:22:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2153 "" "Go-http-client/1.1"
Feb 19 19:22:31 compute-0 podman[208184]: 2026-02-19 19:22:31.285338891 +0000 UTC m=+0.066864706 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:22:31 compute-0 openstack_network_exporter[198916]: ERROR   19:22:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:22:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:22:31 compute-0 openstack_network_exporter[198916]: ERROR   19:22:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:22:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:22:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:22:32.113 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:22:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:22:32.114 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:22:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:22:32.114 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:22:47 compute-0 sshd-session[208210]: error: kex_exchange_identification: read: Connection timed out
Feb 19 19:22:47 compute-0 sshd-session[208210]: banner exchange: Connection from 14.103.86.183 port 58678: Connection timed out
Feb 19 19:22:48 compute-0 podman[208211]: 2026-02-19 19:22:48.30794762 +0000 UTC m=+0.082336796 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 19 19:22:50 compute-0 podman[208232]: 2026-02-19 19:22:50.312260531 +0000 UTC m=+0.088392310 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1770267347, version=9.7, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7)
Feb 19 19:22:53 compute-0 podman[208253]: 2026-02-19 19:22:53.305612698 +0000 UTC m=+0.083035992 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, container_name=ovn_controller)
Feb 19 19:22:59 compute-0 podman[196025]: time="2026-02-19T19:22:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:22:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:22:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:22:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:22:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2152 "" "Go-http-client/1.1"
Feb 19 19:23:01 compute-0 openstack_network_exporter[198916]: ERROR   19:23:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:23:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:23:01 compute-0 openstack_network_exporter[198916]: ERROR   19:23:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:23:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:23:02 compute-0 podman[208280]: 2026-02-19 19:23:02.290741819 +0000 UTC m=+0.061934126 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 19 19:23:02 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:23:02.390 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:23:02 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:23:02.391 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:23:05 compute-0 nova_compute[186662]: 2026-02-19 19:23:05.316 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:23:05 compute-0 nova_compute[186662]: 2026-02-19 19:23:05.317 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:23:05 compute-0 nova_compute[186662]: 2026-02-19 19:23:05.317 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:23:05 compute-0 nova_compute[186662]: 2026-02-19 19:23:05.317 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:23:05 compute-0 nova_compute[186662]: 2026-02-19 19:23:05.573 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:23:06 compute-0 nova_compute[186662]: 2026-02-19 19:23:06.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:23:06 compute-0 nova_compute[186662]: 2026-02-19 19:23:06.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:23:06 compute-0 nova_compute[186662]: 2026-02-19 19:23:06.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:23:08 compute-0 sshd-session[208305]: Invalid user httpd from 197.211.55.20 port 53600
Feb 19 19:23:08 compute-0 sshd-session[208305]: Received disconnect from 197.211.55.20 port 53600:11: Bye Bye [preauth]
Feb 19 19:23:08 compute-0 sshd-session[208305]: Disconnected from invalid user httpd 197.211.55.20 port 53600 [preauth]
Feb 19 19:23:08 compute-0 nova_compute[186662]: 2026-02-19 19:23:08.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:23:08 compute-0 nova_compute[186662]: 2026-02-19 19:23:08.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:23:09 compute-0 nova_compute[186662]: 2026-02-19 19:23:09.083 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:23:09 compute-0 nova_compute[186662]: 2026-02-19 19:23:09.083 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:23:09 compute-0 nova_compute[186662]: 2026-02-19 19:23:09.083 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:23:09 compute-0 nova_compute[186662]: 2026-02-19 19:23:09.084 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:23:09 compute-0 nova_compute[186662]: 2026-02-19 19:23:09.222 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:23:09 compute-0 nova_compute[186662]: 2026-02-19 19:23:09.223 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:23:09 compute-0 nova_compute[186662]: 2026-02-19 19:23:09.233 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.010s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:23:09 compute-0 nova_compute[186662]: 2026-02-19 19:23:09.234 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6068MB free_disk=73.01086044311523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:23:09 compute-0 nova_compute[186662]: 2026-02-19 19:23:09.234 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:23:09 compute-0 nova_compute[186662]: 2026-02-19 19:23:09.235 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:23:10 compute-0 nova_compute[186662]: 2026-02-19 19:23:10.283 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:23:10 compute-0 nova_compute[186662]: 2026-02-19 19:23:10.284 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:23:09 up 54 min,  0 user,  load average: 0.00, 0.19, 0.36\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:23:10 compute-0 nova_compute[186662]: 2026-02-19 19:23:10.298 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:23:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:23:10.392 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:23:10 compute-0 nova_compute[186662]: 2026-02-19 19:23:10.805 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:23:11 compute-0 nova_compute[186662]: 2026-02-19 19:23:11.312 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:23:11 compute-0 nova_compute[186662]: 2026-02-19 19:23:11.313 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.078s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:23:13 compute-0 sshd-session[208308]: Received disconnect from 27.50.25.190 port 33150:11: Bye Bye [preauth]
Feb 19 19:23:13 compute-0 sshd-session[208308]: Disconnected from authenticating user root 27.50.25.190 port 33150 [preauth]
Feb 19 19:23:14 compute-0 sshd-session[208310]: Invalid user tpaterni from 189.165.79.177 port 51068
Feb 19 19:23:14 compute-0 sshd-session[208310]: Received disconnect from 189.165.79.177 port 51068:11: Bye Bye [preauth]
Feb 19 19:23:14 compute-0 sshd-session[208310]: Disconnected from invalid user tpaterni 189.165.79.177 port 51068 [preauth]
Feb 19 19:23:19 compute-0 podman[208312]: 2026-02-19 19:23:19.310252574 +0000 UTC m=+0.083611986 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:23:21 compute-0 podman[208332]: 2026-02-19 19:23:21.297753081 +0000 UTC m=+0.077779416 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, vcs-type=git, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, version=9.7, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64)
Feb 19 19:23:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:23:21.425 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:e5:17 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4eb1a9af-e25f-430e-90e4-b3eb0f848576', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d22f3c65ef04151a6b3dc9a9c081278', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94cd3aa9-8714-4558-a152-9028c9a56ec2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=358d6cf6-f591-479e-aedb-25f7e65d226b) old=Port_Binding(mac=['fa:16:3e:cf:e5:17'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4eb1a9af-e25f-430e-90e4-b3eb0f848576', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d22f3c65ef04151a6b3dc9a9c081278', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:23:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:23:21.426 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 358d6cf6-f591-479e-aedb-25f7e65d226b in datapath 4eb1a9af-e25f-430e-90e4-b3eb0f848576 updated
Feb 19 19:23:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:23:21.427 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4eb1a9af-e25f-430e-90e4-b3eb0f848576, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:23:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:23:21.427 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[5da7b316-c43d-4beb-a7c2-64c8d2a35845]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:23:24 compute-0 podman[208353]: 2026-02-19 19:23:24.348563036 +0000 UTC m=+0.122588751 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest)
Feb 19 19:23:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:23:28.586 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:06:81 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-2142a732-49e0-4c6f-b90d-506e79cc5e31', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2142a732-49e0-4c6f-b90d-506e79cc5e31', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0e5b0642e4554c3c9ab48a99c057c9f3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=63c05d9f-b593-4611-b695-fa702655e6b3, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e67f6c22-7076-4b8c-b455-45ee27216753) old=Port_Binding(mac=['fa:16:3e:b8:06:81'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-2142a732-49e0-4c6f-b90d-506e79cc5e31', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2142a732-49e0-4c6f-b90d-506e79cc5e31', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0e5b0642e4554c3c9ab48a99c057c9f3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:23:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:23:28.587 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e67f6c22-7076-4b8c-b455-45ee27216753 in datapath 2142a732-49e0-4c6f-b90d-506e79cc5e31 updated
Feb 19 19:23:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:23:28.588 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2142a732-49e0-4c6f-b90d-506e79cc5e31, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:23:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:23:28.589 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ec590f33-8fb6-497a-866d-71de4f0d6fd3]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:23:29 compute-0 podman[196025]: time="2026-02-19T19:23:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:23:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:23:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:23:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:23:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2154 "" "Go-http-client/1.1"
Feb 19 19:23:31 compute-0 openstack_network_exporter[198916]: ERROR   19:23:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:23:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:23:31 compute-0 openstack_network_exporter[198916]: ERROR   19:23:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:23:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:23:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:23:32.114 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:23:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:23:32.115 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:23:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:23:32.115 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:23:33 compute-0 podman[208380]: 2026-02-19 19:23:33.291916245 +0000 UTC m=+0.064114057 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:23:50 compute-0 podman[208407]: 2026-02-19 19:23:50.289619554 +0000 UTC m=+0.070407077 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 19 19:23:52 compute-0 podman[208427]: 2026-02-19 19:23:52.291138806 +0000 UTC m=+0.071266978 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=openstack_network_exporter, version=9.7, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 19 19:23:55 compute-0 podman[208448]: 2026-02-19 19:23:55.364108491 +0000 UTC m=+0.140726372 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, container_name=ovn_controller)
Feb 19 19:23:59 compute-0 podman[196025]: time="2026-02-19T19:23:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:23:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:23:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:23:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:23:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2150 "" "Go-http-client/1.1"
Feb 19 19:24:00 compute-0 nova_compute[186662]: 2026-02-19 19:24:00.787 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Acquiring lock "3819a62b-b779-4e11-b72f-0df58e3d54f9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:24:00 compute-0 nova_compute[186662]: 2026-02-19 19:24:00.788 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lock "3819a62b-b779-4e11-b72f-0df58e3d54f9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:24:01 compute-0 nova_compute[186662]: 2026-02-19 19:24:01.294 186666 DEBUG nova.compute.manager [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Feb 19 19:24:01 compute-0 openstack_network_exporter[198916]: ERROR   19:24:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:24:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:24:01 compute-0 openstack_network_exporter[198916]: ERROR   19:24:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:24:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:24:01 compute-0 nova_compute[186662]: 2026-02-19 19:24:01.881 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:24:01 compute-0 nova_compute[186662]: 2026-02-19 19:24:01.882 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:24:01 compute-0 nova_compute[186662]: 2026-02-19 19:24:01.888 186666 DEBUG nova.virt.hardware [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Feb 19 19:24:01 compute-0 nova_compute[186662]: 2026-02-19 19:24:01.888 186666 INFO nova.compute.claims [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Claim successful on node compute-0.ctlplane.example.com
Feb 19 19:24:02 compute-0 nova_compute[186662]: 2026-02-19 19:24:02.932 186666 DEBUG nova.compute.provider_tree [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:24:03 compute-0 nova_compute[186662]: 2026-02-19 19:24:03.438 186666 DEBUG nova.scheduler.client.report [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:24:03 compute-0 nova_compute[186662]: 2026-02-19 19:24:03.948 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.066s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:24:03 compute-0 nova_compute[186662]: 2026-02-19 19:24:03.949 186666 DEBUG nova.compute.manager [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Feb 19 19:24:04 compute-0 podman[208476]: 2026-02-19 19:24:04.260400871 +0000 UTC m=+0.039943568 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:24:04 compute-0 nova_compute[186662]: 2026-02-19 19:24:04.457 186666 DEBUG nova.compute.manager [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Feb 19 19:24:04 compute-0 nova_compute[186662]: 2026-02-19 19:24:04.458 186666 DEBUG nova.network.neutron [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Feb 19 19:24:04 compute-0 nova_compute[186662]: 2026-02-19 19:24:04.459 186666 WARNING neutronclient.v2_0.client [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:24:04 compute-0 nova_compute[186662]: 2026-02-19 19:24:04.460 186666 WARNING neutronclient.v2_0.client [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:24:04 compute-0 nova_compute[186662]: 2026-02-19 19:24:04.965 186666 INFO nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 19 19:24:05 compute-0 nova_compute[186662]: 2026-02-19 19:24:05.313 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:24:05 compute-0 nova_compute[186662]: 2026-02-19 19:24:05.314 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:24:05 compute-0 nova_compute[186662]: 2026-02-19 19:24:05.314 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:24:05 compute-0 nova_compute[186662]: 2026-02-19 19:24:05.314 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:24:05 compute-0 nova_compute[186662]: 2026-02-19 19:24:05.474 186666 DEBUG nova.compute.manager [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Feb 19 19:24:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:05.520 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:24:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:05.521 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:24:05 compute-0 nova_compute[186662]: 2026-02-19 19:24:05.715 186666 DEBUG nova.network.neutron [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Successfully created port: 636c11f2-6f40-406a-9e50-471026200858 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Feb 19 19:24:06 compute-0 nova_compute[186662]: 2026-02-19 19:24:06.492 186666 DEBUG nova.compute.manager [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Feb 19 19:24:06 compute-0 nova_compute[186662]: 2026-02-19 19:24:06.494 186666 DEBUG nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Feb 19 19:24:06 compute-0 nova_compute[186662]: 2026-02-19 19:24:06.495 186666 INFO nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Creating image(s)
Feb 19 19:24:06 compute-0 nova_compute[186662]: 2026-02-19 19:24:06.495 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Acquiring lock "/var/lib/nova/instances/3819a62b-b779-4e11-b72f-0df58e3d54f9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:24:06 compute-0 nova_compute[186662]: 2026-02-19 19:24:06.496 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lock "/var/lib/nova/instances/3819a62b-b779-4e11-b72f-0df58e3d54f9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:24:06 compute-0 nova_compute[186662]: 2026-02-19 19:24:06.497 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lock "/var/lib/nova/instances/3819a62b-b779-4e11-b72f-0df58e3d54f9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:24:06 compute-0 nova_compute[186662]: 2026-02-19 19:24:06.498 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:24:06 compute-0 nova_compute[186662]: 2026-02-19 19:24:06.498 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:24:06 compute-0 nova_compute[186662]: 2026-02-19 19:24:06.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:24:06 compute-0 nova_compute[186662]: 2026-02-19 19:24:06.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:24:06 compute-0 nova_compute[186662]: 2026-02-19 19:24:06.659 186666 DEBUG nova.network.neutron [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Successfully updated port: 636c11f2-6f40-406a-9e50-471026200858 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Feb 19 19:24:06 compute-0 nova_compute[186662]: 2026-02-19 19:24:06.711 186666 DEBUG nova.compute.manager [req-9b17174b-3002-4181-aa83-2569ab1bd71d req-a0c37df2-e14d-4ed4-9601-eda22751477b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Received event network-changed-636c11f2-6f40-406a-9e50-471026200858 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:24:06 compute-0 nova_compute[186662]: 2026-02-19 19:24:06.712 186666 DEBUG nova.compute.manager [req-9b17174b-3002-4181-aa83-2569ab1bd71d req-a0c37df2-e14d-4ed4-9601-eda22751477b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Refreshing instance network info cache due to event network-changed-636c11f2-6f40-406a-9e50-471026200858. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Feb 19 19:24:06 compute-0 nova_compute[186662]: 2026-02-19 19:24:06.712 186666 DEBUG oslo_concurrency.lockutils [req-9b17174b-3002-4181-aa83-2569ab1bd71d req-a0c37df2-e14d-4ed4-9601-eda22751477b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-3819a62b-b779-4e11-b72f-0df58e3d54f9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:24:06 compute-0 nova_compute[186662]: 2026-02-19 19:24:06.713 186666 DEBUG oslo_concurrency.lockutils [req-9b17174b-3002-4181-aa83-2569ab1bd71d req-a0c37df2-e14d-4ed4-9601-eda22751477b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-3819a62b-b779-4e11-b72f-0df58e3d54f9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:24:06 compute-0 nova_compute[186662]: 2026-02-19 19:24:06.713 186666 DEBUG nova.network.neutron [req-9b17174b-3002-4181-aa83-2569ab1bd71d req-a0c37df2-e14d-4ed4-9601-eda22751477b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Refreshing network info cache for port 636c11f2-6f40-406a-9e50-471026200858 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Feb 19 19:24:07 compute-0 nova_compute[186662]: 2026-02-19 19:24:07.164 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Acquiring lock "refresh_cache-3819a62b-b779-4e11-b72f-0df58e3d54f9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:24:07 compute-0 nova_compute[186662]: 2026-02-19 19:24:07.219 186666 WARNING neutronclient.v2_0.client [req-9b17174b-3002-4181-aa83-2569ab1bd71d req-a0c37df2-e14d-4ed4-9601-eda22751477b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:24:07 compute-0 nova_compute[186662]: 2026-02-19 19:24:07.454 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'QFI\xfb') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:24:07 compute-0 nova_compute[186662]: 2026-02-19 19:24:07.462 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:24:07 compute-0 nova_compute[186662]: 2026-02-19 19:24:07.463 186666 DEBUG oslo_concurrency.processutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c.part --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:24:07 compute-0 nova_compute[186662]: 2026-02-19 19:24:07.521 186666 DEBUG oslo_concurrency.processutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c.part --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:24:07 compute-0 nova_compute[186662]: 2026-02-19 19:24:07.523 186666 DEBUG nova.virt.images [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] b8007ea6-afa7-4c5a-abc0-d9d7338ce087 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.12/site-packages/nova/virt/images.py:278
Feb 19 19:24:07 compute-0 nova_compute[186662]: 2026-02-19 19:24:07.525 186666 DEBUG nova.privsep.utils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Feb 19 19:24:07 compute-0 nova_compute[186662]: 2026-02-19 19:24:07.525 186666 DEBUG oslo_concurrency.processutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c.part /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c.converted execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:24:07 compute-0 nova_compute[186662]: 2026-02-19 19:24:07.713 186666 DEBUG oslo_concurrency.processutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c.part /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c.converted" returned: 0 in 0.187s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:24:07 compute-0 nova_compute[186662]: 2026-02-19 19:24:07.715 186666 DEBUG oslo_concurrency.processutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c.converted --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:24:07 compute-0 nova_compute[186662]: 2026-02-19 19:24:07.786 186666 DEBUG oslo_concurrency.processutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c.converted --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:24:07 compute-0 nova_compute[186662]: 2026-02-19 19:24:07.787 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.289s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:24:07 compute-0 nova_compute[186662]: 2026-02-19 19:24:07.788 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:24:07 compute-0 nova_compute[186662]: 2026-02-19 19:24:07.790 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:24:07 compute-0 nova_compute[186662]: 2026-02-19 19:24:07.791 186666 INFO oslo.privsep.daemon [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpmepxrbxy/privsep.sock']
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.237 186666 DEBUG nova.network.neutron [req-9b17174b-3002-4181-aa83-2569ab1bd71d req-a0c37df2-e14d-4ed4-9601-eda22751477b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.433 186666 INFO oslo.privsep.daemon [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Spawned new privsep daemon via rootwrap
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.301 208520 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.304 208520 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.305 208520 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.305 208520 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208520
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.508 186666 DEBUG oslo_concurrency.processutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.547 186666 DEBUG oslo_concurrency.processutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.548 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.548 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.550 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.555 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.556 186666 DEBUG oslo_concurrency.processutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.598 186666 DEBUG oslo_concurrency.processutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.599 186666 DEBUG oslo_concurrency.processutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/3819a62b-b779-4e11-b72f-0df58e3d54f9/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.622 186666 DEBUG oslo_concurrency.processutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/3819a62b-b779-4e11-b72f-0df58e3d54f9/disk 1073741824" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.623 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.075s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.624 186666 DEBUG oslo_concurrency.processutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.676 186666 DEBUG oslo_concurrency.processutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.677 186666 DEBUG nova.virt.disk.api [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Checking if we can resize image /var/lib/nova/instances/3819a62b-b779-4e11-b72f-0df58e3d54f9/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.677 186666 DEBUG oslo_concurrency.processutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3819a62b-b779-4e11-b72f-0df58e3d54f9/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.718 186666 DEBUG oslo_concurrency.processutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3819a62b-b779-4e11-b72f-0df58e3d54f9/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.719 186666 DEBUG nova.virt.disk.api [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Cannot resize image /var/lib/nova/instances/3819a62b-b779-4e11-b72f-0df58e3d54f9/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.719 186666 DEBUG nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.719 186666 DEBUG nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Ensure instance console log exists: /var/lib/nova/instances/3819a62b-b779-4e11-b72f-0df58e3d54f9/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.720 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.720 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:24:08 compute-0 nova_compute[186662]: 2026-02-19 19:24:08.720 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:24:09 compute-0 nova_compute[186662]: 2026-02-19 19:24:09.287 186666 DEBUG nova.network.neutron [req-9b17174b-3002-4181-aa83-2569ab1bd71d req-a0c37df2-e14d-4ed4-9601-eda22751477b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:24:09 compute-0 nova_compute[186662]: 2026-02-19 19:24:09.798 186666 DEBUG oslo_concurrency.lockutils [req-9b17174b-3002-4181-aa83-2569ab1bd71d req-a0c37df2-e14d-4ed4-9601-eda22751477b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-3819a62b-b779-4e11-b72f-0df58e3d54f9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:24:09 compute-0 nova_compute[186662]: 2026-02-19 19:24:09.798 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Acquired lock "refresh_cache-3819a62b-b779-4e11-b72f-0df58e3d54f9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:24:09 compute-0 nova_compute[186662]: 2026-02-19 19:24:09.799 186666 DEBUG nova.network.neutron [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:24:10 compute-0 nova_compute[186662]: 2026-02-19 19:24:10.387 186666 DEBUG nova.network.neutron [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:24:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:10.522 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:24:10 compute-0 nova_compute[186662]: 2026-02-19 19:24:10.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:24:10 compute-0 nova_compute[186662]: 2026-02-19 19:24:10.595 186666 WARNING neutronclient.v2_0.client [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:24:10 compute-0 nova_compute[186662]: 2026-02-19 19:24:10.730 186666 DEBUG nova.network.neutron [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Updating instance_info_cache with network_info: [{"id": "636c11f2-6f40-406a-9e50-471026200858", "address": "fa:16:3e:ea:42:4c", "network": {"id": "4eb1a9af-e25f-430e-90e4-b3eb0f848576", "bridge": "br-int", "label": "tempest-TestDataModel-670056028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d22f3c65ef04151a6b3dc9a9c081278", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap636c11f2-6f", "ovs_interfaceid": "636c11f2-6f40-406a-9e50-471026200858", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.085 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.085 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.085 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.085 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.186 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.187 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.199 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.011s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.199 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5982MB free_disk=72.9764633178711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.200 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.200 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.235 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Releasing lock "refresh_cache-3819a62b-b779-4e11-b72f-0df58e3d54f9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.236 186666 DEBUG nova.compute.manager [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Instance network_info: |[{"id": "636c11f2-6f40-406a-9e50-471026200858", "address": "fa:16:3e:ea:42:4c", "network": {"id": "4eb1a9af-e25f-430e-90e4-b3eb0f848576", "bridge": "br-int", "label": "tempest-TestDataModel-670056028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d22f3c65ef04151a6b3dc9a9c081278", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap636c11f2-6f", "ovs_interfaceid": "636c11f2-6f40-406a-9e50-471026200858", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.238 186666 DEBUG nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Start _get_guest_xml network_info=[{"id": "636c11f2-6f40-406a-9e50-471026200858", "address": "fa:16:3e:ea:42:4c", "network": {"id": "4eb1a9af-e25f-430e-90e4-b3eb0f848576", "bridge": "br-int", "label": "tempest-TestDataModel-670056028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d22f3c65ef04151a6b3dc9a9c081278", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap636c11f2-6f", "ovs_interfaceid": "636c11f2-6f40-406a-9e50-471026200858", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'image_id': 'b8007ea6-afa7-4c5a-abc0-d9d7338ce087'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.241 186666 WARNING nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.242 186666 DEBUG nova.virt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', instance_meta=NovaInstanceMeta(name='tempest-TestDataModel-server-992543204', uuid='3819a62b-b779-4e11-b72f-0df58e3d54f9'), owner=OwnerMeta(userid='674fe20ddf9a44aeae2b9cdc9b63c3b9', username='tempest-TestDataModel-2087416303-project-admin', projectid='0e5b0642e4554c3c9ab48a99c057c9f3', projectname='tempest-TestDataModel-2087416303'), image=ImageMeta(id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "636c11f2-6f40-406a-9e50-471026200858", "address": "fa:16:3e:ea:42:4c", "network": {"id": "4eb1a9af-e25f-430e-90e4-b3eb0f848576", "bridge": "br-int", "label": "tempest-TestDataModel-670056028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d22f3c65ef04151a6b3dc9a9c081278", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap636c11f2-6f", "ovs_interfaceid": "636c11f2-6f40-406a-9e50-471026200858", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1771529051.2422187) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.245 186666 DEBUG nova.virt.libvirt.host [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.245 186666 DEBUG nova.virt.libvirt.host [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.247 186666 DEBUG nova.virt.libvirt.host [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.248 186666 DEBUG nova.virt.libvirt.host [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.249 186666 DEBUG nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.249 186666 DEBUG nova.virt.hardware [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-19T19:19:33Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.249 186666 DEBUG nova.virt.hardware [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.249 186666 DEBUG nova.virt.hardware [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.249 186666 DEBUG nova.virt.hardware [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.250 186666 DEBUG nova.virt.hardware [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.250 186666 DEBUG nova.virt.hardware [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.250 186666 DEBUG nova.virt.hardware [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.250 186666 DEBUG nova.virt.hardware [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.250 186666 DEBUG nova.virt.hardware [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.251 186666 DEBUG nova.virt.hardware [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.251 186666 DEBUG nova.virt.hardware [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.254 186666 DEBUG nova.privsep.utils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.254 186666 DEBUG nova.virt.libvirt.vif [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:23:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-992543204',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-992543204',id=3,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0e5b0642e4554c3c9ab48a99c057c9f3',ramdisk_id='',reservation_id='r-i5sh6sdu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-2087416303',owner_user_name='tempest-TestDataModel-2087416303-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:24:05Z,user_data=None,user_id='674fe20ddf9a44aeae2b9cdc9b63c3b9',uuid=3819a62b-b779-4e11-b72f-0df58e3d54f9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "636c11f2-6f40-406a-9e50-471026200858", "address": "fa:16:3e:ea:42:4c", "network": {"id": "4eb1a9af-e25f-430e-90e4-b3eb0f848576", "bridge": "br-int", "label": "tempest-TestDataModel-670056028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d22f3c65ef04151a6b3dc9a9c081278", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap636c11f2-6f", "ovs_interfaceid": "636c11f2-6f40-406a-9e50-471026200858", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.255 186666 DEBUG nova.network.os_vif_util [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Converting VIF {"id": "636c11f2-6f40-406a-9e50-471026200858", "address": "fa:16:3e:ea:42:4c", "network": {"id": "4eb1a9af-e25f-430e-90e4-b3eb0f848576", "bridge": "br-int", "label": "tempest-TestDataModel-670056028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d22f3c65ef04151a6b3dc9a9c081278", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap636c11f2-6f", "ovs_interfaceid": "636c11f2-6f40-406a-9e50-471026200858", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.256 186666 DEBUG nova.network.os_vif_util [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:42:4c,bridge_name='br-int',has_traffic_filtering=True,id=636c11f2-6f40-406a-9e50-471026200858,network=Network(4eb1a9af-e25f-430e-90e4-b3eb0f848576),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap636c11f2-6f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.257 186666 DEBUG nova.objects.instance [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3819a62b-b779-4e11-b72f-0df58e3d54f9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.762 186666 DEBUG nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] End _get_guest_xml xml=<domain type="kvm">
Feb 19 19:24:11 compute-0 nova_compute[186662]:   <uuid>3819a62b-b779-4e11-b72f-0df58e3d54f9</uuid>
Feb 19 19:24:11 compute-0 nova_compute[186662]:   <name>instance-00000003</name>
Feb 19 19:24:11 compute-0 nova_compute[186662]:   <memory>131072</memory>
Feb 19 19:24:11 compute-0 nova_compute[186662]:   <vcpu>1</vcpu>
Feb 19 19:24:11 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <nova:name>tempest-TestDataModel-server-992543204</nova:name>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:24:11</nova:creationTime>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:24:11 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:24:11 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:24:11 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:24:11 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:24:11 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:24:11 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:24:11 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:24:11 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:24:11 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:24:11 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:24:11 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:24:11 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:24:11 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:24:11 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:24:11 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:24:11 compute-0 nova_compute[186662]:         <nova:user uuid="674fe20ddf9a44aeae2b9cdc9b63c3b9">tempest-TestDataModel-2087416303-project-admin</nova:user>
Feb 19 19:24:11 compute-0 nova_compute[186662]:         <nova:project uuid="0e5b0642e4554c3c9ab48a99c057c9f3">tempest-TestDataModel-2087416303</nova:project>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:24:11 compute-0 nova_compute[186662]:         <nova:port uuid="636c11f2-6f40-406a-9e50-471026200858">
Feb 19 19:24:11 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:24:11 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:24:11 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <system>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <entry name="serial">3819a62b-b779-4e11-b72f-0df58e3d54f9</entry>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <entry name="uuid">3819a62b-b779-4e11-b72f-0df58e3d54f9</entry>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     </system>
Feb 19 19:24:11 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:24:11 compute-0 nova_compute[186662]:   <os>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:   </os>
Feb 19 19:24:11 compute-0 nova_compute[186662]:   <features>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <vmcoreinfo/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:   </features>
Feb 19 19:24:11 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:24:11 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact">
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <model>Nehalem</model>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <topology sockets="1" cores="1" threads="1"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:24:11 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/3819a62b-b779-4e11-b72f-0df58e3d54f9/disk"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/3819a62b-b779-4e11-b72f-0df58e3d54f9/disk.config"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <interface type="ethernet">
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <mac address="fa:16:3e:ea:42:4c"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <mtu size="1442"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <target dev="tap636c11f2-6f"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <serial type="pty">
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/3819a62b-b779-4e11-b72f-0df58e3d54f9/console.log" append="off"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <video>
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     </video>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <controller type="usb" index="0"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:24:11 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:24:11 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:24:11 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:24:11 compute-0 nova_compute[186662]: </domain>
Feb 19 19:24:11 compute-0 nova_compute[186662]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.763 186666 DEBUG nova.compute.manager [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Preparing to wait for external event network-vif-plugged-636c11f2-6f40-406a-9e50-471026200858 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.763 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Acquiring lock "3819a62b-b779-4e11-b72f-0df58e3d54f9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.763 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lock "3819a62b-b779-4e11-b72f-0df58e3d54f9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.763 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lock "3819a62b-b779-4e11-b72f-0df58e3d54f9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.764 186666 DEBUG nova.virt.libvirt.vif [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:23:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-992543204',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-992543204',id=3,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0e5b0642e4554c3c9ab48a99c057c9f3',ramdisk_id='',reservation_id='r-i5sh6sdu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-2087416303',owner_user_name='tempest-TestDataModel-2087416303-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:24:05Z,user_data=None,user_id='674fe20ddf9a44aeae2b9cdc9b63c3b9',uuid=3819a62b-b779-4e11-b72f-0df58e3d54f9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "636c11f2-6f40-406a-9e50-471026200858", "address": "fa:16:3e:ea:42:4c", "network": {"id": "4eb1a9af-e25f-430e-90e4-b3eb0f848576", "bridge": "br-int", "label": "tempest-TestDataModel-670056028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d22f3c65ef04151a6b3dc9a9c081278", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap636c11f2-6f", "ovs_interfaceid": "636c11f2-6f40-406a-9e50-471026200858", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.764 186666 DEBUG nova.network.os_vif_util [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Converting VIF {"id": "636c11f2-6f40-406a-9e50-471026200858", "address": "fa:16:3e:ea:42:4c", "network": {"id": "4eb1a9af-e25f-430e-90e4-b3eb0f848576", "bridge": "br-int", "label": "tempest-TestDataModel-670056028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d22f3c65ef04151a6b3dc9a9c081278", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap636c11f2-6f", "ovs_interfaceid": "636c11f2-6f40-406a-9e50-471026200858", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.764 186666 DEBUG nova.network.os_vif_util [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:42:4c,bridge_name='br-int',has_traffic_filtering=True,id=636c11f2-6f40-406a-9e50-471026200858,network=Network(4eb1a9af-e25f-430e-90e4-b3eb0f848576),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap636c11f2-6f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.765 186666 DEBUG os_vif [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:42:4c,bridge_name='br-int',has_traffic_filtering=True,id=636c11f2-6f40-406a-9e50-471026200858,network=Network(4eb1a9af-e25f-430e-90e4-b3eb0f848576),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap636c11f2-6f') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.797 186666 DEBUG ovsdbapp.backend.ovs_idl [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.798 186666 DEBUG ovsdbapp.backend.ovs_idl [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.798 186666 DEBUG ovsdbapp.backend.ovs_idl [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.799 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.799 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.799 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.800 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.801 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.803 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.810 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.810 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.811 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.812 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.812 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '267864bd-2410-5838-ae98-307b8902fe60', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.813 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.814 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:11 compute-0 nova_compute[186662]: 2026-02-19 19:24:11.814 186666 INFO oslo.privsep.daemon [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpz_zpeepe/privsep.sock']
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.255 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance 3819a62b-b779-4e11-b72f-0df58e3d54f9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.255 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.255 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:24:11 up 55 min,  0 user,  load average: 0.08, 0.17, 0.34\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_0e5b0642e4554c3c9ab48a99c057c9f3': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.295 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating inventory in ProviderTree for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.510 186666 INFO oslo.privsep.daemon [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Spawned new privsep daemon via rootwrap
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.382 208542 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.386 208542 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.388 208542 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.389 208542 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208542
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.752 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.752 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap636c11f2-6f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.753 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap636c11f2-6f, col_values=(('qos', UUID('b2cdfa0e-cd87-4b61-bdc4-9b823534602a')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.754 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap636c11f2-6f, col_values=(('external_ids', {'iface-id': '636c11f2-6f40-406a-9e50-471026200858', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ea:42:4c', 'vm-uuid': '3819a62b-b779-4e11-b72f-0df58e3d54f9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.755 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:12 compute-0 NetworkManager[56519]: <info>  [1771529052.7559] manager: (tap636c11f2-6f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.757 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.759 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.760 186666 INFO os_vif [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:42:4c,bridge_name='br-int',has_traffic_filtering=True,id=636c11f2-6f40-406a-9e50-471026200858,network=Network(4eb1a9af-e25f-430e-90e4-b3eb0f848576),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap636c11f2-6f')
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.820 186666 ERROR nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] [req-f6d42cac-406a-400a-b686-773a27f0e991] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-f6d42cac-406a-400a-b686-773a27f0e991"}]}
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.834 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing inventories for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.849 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating ProviderTree inventory for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.849 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating inventory in ProviderTree for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.860 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing aggregate associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.875 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing trait associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_SSE2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,HW_ARCH_X86_64,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_USB _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Feb 19 19:24:12 compute-0 nova_compute[186662]: 2026-02-19 19:24:12.903 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating inventory in ProviderTree for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Feb 19 19:24:13 compute-0 nova_compute[186662]: 2026-02-19 19:24:13.442 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updated inventory for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Feb 19 19:24:13 compute-0 nova_compute[186662]: 2026-02-19 19:24:13.442 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Feb 19 19:24:13 compute-0 nova_compute[186662]: 2026-02-19 19:24:13.443 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating inventory in ProviderTree for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Feb 19 19:24:13 compute-0 nova_compute[186662]: 2026-02-19 19:24:13.965 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:24:13 compute-0 nova_compute[186662]: 2026-02-19 19:24:13.965 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.765s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:24:14 compute-0 nova_compute[186662]: 2026-02-19 19:24:14.292 186666 DEBUG nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:24:14 compute-0 nova_compute[186662]: 2026-02-19 19:24:14.293 186666 DEBUG nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:24:14 compute-0 nova_compute[186662]: 2026-02-19 19:24:14.293 186666 DEBUG nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] No VIF found with MAC fa:16:3e:ea:42:4c, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Feb 19 19:24:14 compute-0 nova_compute[186662]: 2026-02-19 19:24:14.293 186666 INFO nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Using config drive
Feb 19 19:24:14 compute-0 nova_compute[186662]: 2026-02-19 19:24:14.802 186666 WARNING neutronclient.v2_0.client [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:24:15 compute-0 nova_compute[186662]: 2026-02-19 19:24:15.320 186666 INFO nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Creating config drive at /var/lib/nova/instances/3819a62b-b779-4e11-b72f-0df58e3d54f9/disk.config
Feb 19 19:24:15 compute-0 nova_compute[186662]: 2026-02-19 19:24:15.323 186666 DEBUG oslo_concurrency.processutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3819a62b-b779-4e11-b72f-0df58e3d54f9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpbzlrq_92 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:24:15 compute-0 nova_compute[186662]: 2026-02-19 19:24:15.440 186666 DEBUG oslo_concurrency.processutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3819a62b-b779-4e11-b72f-0df58e3d54f9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpbzlrq_92" returned: 0 in 0.116s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:24:15 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Feb 19 19:24:15 compute-0 kernel: tap636c11f2-6f: entered promiscuous mode
Feb 19 19:24:15 compute-0 NetworkManager[56519]: <info>  [1771529055.4954] manager: (tap636c11f2-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Feb 19 19:24:15 compute-0 ovn_controller[96653]: 2026-02-19T19:24:15Z|00040|binding|INFO|Claiming lport 636c11f2-6f40-406a-9e50-471026200858 for this chassis.
Feb 19 19:24:15 compute-0 ovn_controller[96653]: 2026-02-19T19:24:15Z|00041|binding|INFO|636c11f2-6f40-406a-9e50-471026200858: Claiming fa:16:3e:ea:42:4c 10.100.0.10
Feb 19 19:24:15 compute-0 nova_compute[186662]: 2026-02-19 19:24:15.496 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:15 compute-0 nova_compute[186662]: 2026-02-19 19:24:15.499 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:15 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:15.507 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:42:4c 10.100.0.10'], port_security=['fa:16:3e:ea:42:4c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '3819a62b-b779-4e11-b72f-0df58e3d54f9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4eb1a9af-e25f-430e-90e4-b3eb0f848576', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0e5b0642e4554c3c9ab48a99c057c9f3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '19427dbb-d315-45f4-95e1-34612eeef484', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94cd3aa9-8714-4558-a152-9028c9a56ec2, chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=636c11f2-6f40-406a-9e50-471026200858) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:24:15 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:15.507 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 636c11f2-6f40-406a-9e50-471026200858 in datapath 4eb1a9af-e25f-430e-90e4-b3eb0f848576 bound to our chassis
Feb 19 19:24:15 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:15.510 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4eb1a9af-e25f-430e-90e4-b3eb0f848576
Feb 19 19:24:15 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:15.531 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[8a93fa3e-b88f-4099-91a3-9fa23f6c7457]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:15 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:15.532 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4eb1a9af-e1 in ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Feb 19 19:24:15 compute-0 nova_compute[186662]: 2026-02-19 19:24:15.531 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:15 compute-0 systemd-udevd[208571]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:24:15 compute-0 ovn_controller[96653]: 2026-02-19T19:24:15Z|00042|binding|INFO|Setting lport 636c11f2-6f40-406a-9e50-471026200858 ovn-installed in OVS
Feb 19 19:24:15 compute-0 ovn_controller[96653]: 2026-02-19T19:24:15Z|00043|binding|INFO|Setting lport 636c11f2-6f40-406a-9e50-471026200858 up in Southbound
Feb 19 19:24:15 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:15.536 207540 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4eb1a9af-e0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Feb 19 19:24:15 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:15.536 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[e0364fea-83a7-49da-9500-fa1549c29379]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:15 compute-0 nova_compute[186662]: 2026-02-19 19:24:15.536 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:15 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:15.537 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[843a651f-4355-4f6d-8b98-afab83a24892]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:15 compute-0 NetworkManager[56519]: <info>  [1771529055.5466] device (tap636c11f2-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:24:15 compute-0 NetworkManager[56519]: <info>  [1771529055.5472] device (tap636c11f2-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:24:15 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:15.548 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[3ca147da-450d-4770-aa09-cf5ac1fb069e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:15 compute-0 systemd-machined[156014]: New machine qemu-1-instance-00000003.
Feb 19 19:24:15 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:15.554 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a7d72fcb-35b0-4a9f-abac-4d64ec21179d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:15 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:15.555 105986 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpabm6078j/privsep.sock']
Feb 19 19:24:15 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000003.
Feb 19 19:24:15 compute-0 nova_compute[186662]: 2026-02-19 19:24:15.695 186666 DEBUG nova.compute.manager [req-82eb6416-b501-4daf-95b0-4d5cf5987295 req-08bdc923-86a5-44f0-bf9f-ddfd280d9d2f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Received event network-vif-plugged-636c11f2-6f40-406a-9e50-471026200858 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:24:15 compute-0 nova_compute[186662]: 2026-02-19 19:24:15.696 186666 DEBUG oslo_concurrency.lockutils [req-82eb6416-b501-4daf-95b0-4d5cf5987295 req-08bdc923-86a5-44f0-bf9f-ddfd280d9d2f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "3819a62b-b779-4e11-b72f-0df58e3d54f9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:24:15 compute-0 nova_compute[186662]: 2026-02-19 19:24:15.696 186666 DEBUG oslo_concurrency.lockutils [req-82eb6416-b501-4daf-95b0-4d5cf5987295 req-08bdc923-86a5-44f0-bf9f-ddfd280d9d2f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "3819a62b-b779-4e11-b72f-0df58e3d54f9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:24:15 compute-0 nova_compute[186662]: 2026-02-19 19:24:15.697 186666 DEBUG oslo_concurrency.lockutils [req-82eb6416-b501-4daf-95b0-4d5cf5987295 req-08bdc923-86a5-44f0-bf9f-ddfd280d9d2f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "3819a62b-b779-4e11-b72f-0df58e3d54f9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:24:15 compute-0 nova_compute[186662]: 2026-02-19 19:24:15.697 186666 DEBUG nova.compute.manager [req-82eb6416-b501-4daf-95b0-4d5cf5987295 req-08bdc923-86a5-44f0-bf9f-ddfd280d9d2f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Processing event network-vif-plugged-636c11f2-6f40-406a-9e50-471026200858 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Feb 19 19:24:16 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:16.190 105986 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 19 19:24:16 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:16.190 105986 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpabm6078j/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Feb 19 19:24:16 compute-0 nova_compute[186662]: 2026-02-19 19:24:16.190 186666 DEBUG nova.compute.manager [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Feb 19 19:24:16 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:16.077 208593 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 19 19:24:16 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:16.080 208593 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 19 19:24:16 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:16.081 208593 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 19 19:24:16 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:16.081 208593 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208593
Feb 19 19:24:16 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:16.192 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[e2708a8d-5dc7-4748-842c-cc68400054c2]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:16 compute-0 nova_compute[186662]: 2026-02-19 19:24:16.193 186666 DEBUG nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Feb 19 19:24:16 compute-0 nova_compute[186662]: 2026-02-19 19:24:16.204 186666 INFO nova.virt.libvirt.driver [-] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Instance spawned successfully.
Feb 19 19:24:16 compute-0 nova_compute[186662]: 2026-02-19 19:24:16.205 186666 DEBUG nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Feb 19 19:24:16 compute-0 nova_compute[186662]: 2026-02-19 19:24:16.543 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:16 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:16.646 208593 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:24:16 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:16.647 208593 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:24:16 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:16.647 208593 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:24:16 compute-0 nova_compute[186662]: 2026-02-19 19:24:16.716 186666 DEBUG nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:24:16 compute-0 nova_compute[186662]: 2026-02-19 19:24:16.717 186666 DEBUG nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:24:16 compute-0 nova_compute[186662]: 2026-02-19 19:24:16.718 186666 DEBUG nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:24:16 compute-0 nova_compute[186662]: 2026-02-19 19:24:16.719 186666 DEBUG nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:24:16 compute-0 nova_compute[186662]: 2026-02-19 19:24:16.719 186666 DEBUG nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:24:16 compute-0 nova_compute[186662]: 2026-02-19 19:24:16.720 186666 DEBUG nova.virt.libvirt.driver [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.091 208593 INFO oslo_service.backend [-] Loading backend: eventlet
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.096 208593 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.159 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[e8a02389-7738-47e5-9dc4-5932d5d3093b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.174 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[fcde1964-7f74-44fd-a2f5-ea02971823ff]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:17 compute-0 NetworkManager[56519]: <info>  [1771529057.1747] manager: (tap4eb1a9af-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Feb 19 19:24:17 compute-0 systemd-udevd[208573]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.198 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[8cf31c95-b001-43a2-a3ad-be9c57ef43d8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.200 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[e550ae7c-e522-4a75-a6d6-6e492c7a9209]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:17 compute-0 NetworkManager[56519]: <info>  [1771529057.2176] device (tap4eb1a9af-e0): carrier: link connected
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.222 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[077fd2e8-fbf6-4ae1-b014-ad1e37284d1e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.232 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[f1d10d44-907d-4f85-a8f8-b3514fd4ef6b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4eb1a9af-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:e5:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 332678, 'reachable_time': 36250, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208626, 'error': None, 'target': 'ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:17 compute-0 nova_compute[186662]: 2026-02-19 19:24:17.231 186666 INFO nova.compute.manager [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Took 10.74 seconds to spawn the instance on the hypervisor.
Feb 19 19:24:17 compute-0 nova_compute[186662]: 2026-02-19 19:24:17.232 186666 DEBUG nova.compute.manager [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.241 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[7d5dbb9c-5b50-4a22-b284-04c7e3632918]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecf:e517'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 332678, 'tstamp': 332678}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208627, 'error': None, 'target': 'ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.247 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[6dedb81c-74de-447a-98cd-258bbdc1d341]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4eb1a9af-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:e5:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 332678, 'reachable_time': 36250, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 208628, 'error': None, 'target': 'ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.264 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[bca7c3fd-34b8-46c9-a2e9-2118c2a182f8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.299 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[239d47e3-8a42-4266-b8d1-fc1fb6a20bcc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.300 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4eb1a9af-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.301 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.301 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4eb1a9af-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:24:17 compute-0 nova_compute[186662]: 2026-02-19 19:24:17.302 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:17 compute-0 NetworkManager[56519]: <info>  [1771529057.3035] manager: (tap4eb1a9af-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Feb 19 19:24:17 compute-0 kernel: tap4eb1a9af-e0: entered promiscuous mode
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.306 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4eb1a9af-e0, col_values=(('external_ids', {'iface-id': '358d6cf6-f591-479e-aedb-25f7e65d226b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:24:17 compute-0 nova_compute[186662]: 2026-02-19 19:24:17.307 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:17 compute-0 ovn_controller[96653]: 2026-02-19T19:24:17Z|00044|binding|INFO|Releasing lport 358d6cf6-f591-479e-aedb-25f7e65d226b from this chassis (sb_readonly=0)
Feb 19 19:24:17 compute-0 nova_compute[186662]: 2026-02-19 19:24:17.307 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.309 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[444da718-aea1-434f-8094-aa86ad19872a]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.310 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4eb1a9af-e25f-430e-90e4-b3eb0f848576.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4eb1a9af-e25f-430e-90e4-b3eb0f848576.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.310 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4eb1a9af-e25f-430e-90e4-b3eb0f848576.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4eb1a9af-e25f-430e-90e4-b3eb0f848576.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.310 105986 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 4eb1a9af-e25f-430e-90e4-b3eb0f848576 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Feb 19 19:24:17 compute-0 nova_compute[186662]: 2026-02-19 19:24:17.311 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.312 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4eb1a9af-e25f-430e-90e4-b3eb0f848576.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4eb1a9af-e25f-430e-90e4-b3eb0f848576.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.312 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[464c3f21-9150-4c1c-96ae-31a21b42fe4d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.312 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4eb1a9af-e25f-430e-90e4-b3eb0f848576.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4eb1a9af-e25f-430e-90e4-b3eb0f848576.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.313 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[afb63659-c85c-43e8-b161-ad38e6b557e9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.313 105986 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: global
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     log         /dev/log local0 debug
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     log-tag     haproxy-metadata-proxy-4eb1a9af-e25f-430e-90e4-b3eb0f848576
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     user        root
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     group       root
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     maxconn     1024
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     pidfile     /var/lib/neutron/external/pids/4eb1a9af-e25f-430e-90e4-b3eb0f848576.pid.haproxy
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     daemon
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: defaults
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     log global
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     mode http
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     option httplog
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     option dontlognull
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     option http-server-close
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     option forwardfor
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     retries                 3
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     timeout http-request    30s
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     timeout connect         30s
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     timeout client          32s
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     timeout server          32s
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     timeout http-keep-alive 30s
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: listen listener
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     bind 169.254.169.254:80
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     server metadata /var/lib/neutron/metadata_proxy
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:     http-request add-header X-OVN-Network-ID 4eb1a9af-e25f-430e-90e4-b3eb0f848576
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Feb 19 19:24:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:17.313 105986 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576', 'env', 'PROCESS_TAG=haproxy-4eb1a9af-e25f-430e-90e4-b3eb0f848576', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4eb1a9af-e25f-430e-90e4-b3eb0f848576.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Feb 19 19:24:17 compute-0 podman[208661]: 2026-02-19 19:24:17.666278403 +0000 UTC m=+0.052338154 container create b92d8901e78a19144b6e24edba5af1fb6d08e18354ba06cb31bda02088ee8252 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS)
Feb 19 19:24:17 compute-0 systemd[1]: Started libpod-conmon-b92d8901e78a19144b6e24edba5af1fb6d08e18354ba06cb31bda02088ee8252.scope.
Feb 19 19:24:17 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:24:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97391cb91c17b868d1f99be92cb292681ce389c2974867e34764d72316165c09/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 19 19:24:17 compute-0 podman[208661]: 2026-02-19 19:24:17.64149142 +0000 UTC m=+0.027551171 image pull dfc4ee2c621ea595c16a7cd7a8233293e80df6b99346b1ce1a7ed22a35aace27 38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Feb 19 19:24:17 compute-0 podman[208661]: 2026-02-19 19:24:17.733057072 +0000 UTC m=+0.119116823 container init b92d8901e78a19144b6e24edba5af1fb6d08e18354ba06cb31bda02088ee8252 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 19 19:24:17 compute-0 podman[208661]: 2026-02-19 19:24:17.736567255 +0000 UTC m=+0.122627006 container start b92d8901e78a19144b6e24edba5af1fb6d08e18354ba06cb31bda02088ee8252 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 19 19:24:17 compute-0 nova_compute[186662]: 2026-02-19 19:24:17.740 186666 DEBUG nova.compute.manager [req-e47c5a99-75fc-4985-b7b9-e273e83585c0 req-877de6f5-2396-4f8d-a70f-40a473be9aa8 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Received event network-vif-plugged-636c11f2-6f40-406a-9e50-471026200858 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:24:17 compute-0 nova_compute[186662]: 2026-02-19 19:24:17.740 186666 DEBUG oslo_concurrency.lockutils [req-e47c5a99-75fc-4985-b7b9-e273e83585c0 req-877de6f5-2396-4f8d-a70f-40a473be9aa8 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "3819a62b-b779-4e11-b72f-0df58e3d54f9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:24:17 compute-0 nova_compute[186662]: 2026-02-19 19:24:17.740 186666 DEBUG oslo_concurrency.lockutils [req-e47c5a99-75fc-4985-b7b9-e273e83585c0 req-877de6f5-2396-4f8d-a70f-40a473be9aa8 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "3819a62b-b779-4e11-b72f-0df58e3d54f9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:24:17 compute-0 nova_compute[186662]: 2026-02-19 19:24:17.741 186666 DEBUG oslo_concurrency.lockutils [req-e47c5a99-75fc-4985-b7b9-e273e83585c0 req-877de6f5-2396-4f8d-a70f-40a473be9aa8 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "3819a62b-b779-4e11-b72f-0df58e3d54f9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:24:17 compute-0 nova_compute[186662]: 2026-02-19 19:24:17.741 186666 DEBUG nova.compute.manager [req-e47c5a99-75fc-4985-b7b9-e273e83585c0 req-877de6f5-2396-4f8d-a70f-40a473be9aa8 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] No waiting events found dispatching network-vif-plugged-636c11f2-6f40-406a-9e50-471026200858 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:24:17 compute-0 nova_compute[186662]: 2026-02-19 19:24:17.741 186666 WARNING nova.compute.manager [req-e47c5a99-75fc-4985-b7b9-e273e83585c0 req-877de6f5-2396-4f8d-a70f-40a473be9aa8 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Received unexpected event network-vif-plugged-636c11f2-6f40-406a-9e50-471026200858 for instance with vm_state active and task_state None.
Feb 19 19:24:17 compute-0 neutron-haproxy-ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576[208676]: [NOTICE]   (208680) : New worker (208682) forked
Feb 19 19:24:17 compute-0 neutron-haproxy-ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576[208676]: [NOTICE]   (208680) : Loading success.
Feb 19 19:24:17 compute-0 nova_compute[186662]: 2026-02-19 19:24:17.766 186666 INFO nova.compute.manager [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Took 15.96 seconds to build instance.
Feb 19 19:24:17 compute-0 nova_compute[186662]: 2026-02-19 19:24:17.799 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:18 compute-0 nova_compute[186662]: 2026-02-19 19:24:18.270 186666 DEBUG oslo_concurrency.lockutils [None req-1321b9da-8eea-492b-b2c4-d61d6ba7bca0 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lock "3819a62b-b779-4e11-b72f-0df58e3d54f9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.482s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:24:21 compute-0 podman[208691]: 2026-02-19 19:24:21.275273542 +0000 UTC m=+0.050381397 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Feb 19 19:24:21 compute-0 nova_compute[186662]: 2026-02-19 19:24:21.545 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:22 compute-0 nova_compute[186662]: 2026-02-19 19:24:22.831 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:23 compute-0 podman[208710]: 2026-02-19 19:24:23.275659817 +0000 UTC m=+0.050352317 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, architecture=x86_64, build-date=2026-02-05T04:57:10Z, distribution-scope=public, release=1770267347, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 19 19:24:23 compute-0 nova_compute[186662]: 2026-02-19 19:24:23.766 186666 DEBUG oslo_concurrency.lockutils [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Acquiring lock "3819a62b-b779-4e11-b72f-0df58e3d54f9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:24:23 compute-0 nova_compute[186662]: 2026-02-19 19:24:23.767 186666 DEBUG oslo_concurrency.lockutils [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lock "3819a62b-b779-4e11-b72f-0df58e3d54f9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:24:23 compute-0 nova_compute[186662]: 2026-02-19 19:24:23.767 186666 DEBUG oslo_concurrency.lockutils [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Acquiring lock "3819a62b-b779-4e11-b72f-0df58e3d54f9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:24:23 compute-0 nova_compute[186662]: 2026-02-19 19:24:23.767 186666 DEBUG oslo_concurrency.lockutils [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lock "3819a62b-b779-4e11-b72f-0df58e3d54f9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:24:23 compute-0 nova_compute[186662]: 2026-02-19 19:24:23.767 186666 DEBUG oslo_concurrency.lockutils [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lock "3819a62b-b779-4e11-b72f-0df58e3d54f9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:24:23 compute-0 nova_compute[186662]: 2026-02-19 19:24:23.779 186666 INFO nova.compute.manager [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Terminating instance
Feb 19 19:24:24 compute-0 nova_compute[186662]: 2026-02-19 19:24:24.293 186666 DEBUG nova.compute.manager [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:24:24 compute-0 kernel: tap636c11f2-6f (unregistering): left promiscuous mode
Feb 19 19:24:24 compute-0 NetworkManager[56519]: <info>  [1771529064.3142] device (tap636c11f2-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:24:24 compute-0 ovn_controller[96653]: 2026-02-19T19:24:24Z|00045|binding|INFO|Releasing lport 636c11f2-6f40-406a-9e50-471026200858 from this chassis (sb_readonly=0)
Feb 19 19:24:24 compute-0 ovn_controller[96653]: 2026-02-19T19:24:24Z|00046|binding|INFO|Setting lport 636c11f2-6f40-406a-9e50-471026200858 down in Southbound
Feb 19 19:24:24 compute-0 nova_compute[186662]: 2026-02-19 19:24:24.319 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:24 compute-0 ovn_controller[96653]: 2026-02-19T19:24:24Z|00047|binding|INFO|Removing iface tap636c11f2-6f ovn-installed in OVS
Feb 19 19:24:24 compute-0 nova_compute[186662]: 2026-02-19 19:24:24.321 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:24 compute-0 nova_compute[186662]: 2026-02-19 19:24:24.326 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:24.328 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:42:4c 10.100.0.10'], port_security=['fa:16:3e:ea:42:4c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '3819a62b-b779-4e11-b72f-0df58e3d54f9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4eb1a9af-e25f-430e-90e4-b3eb0f848576', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0e5b0642e4554c3c9ab48a99c057c9f3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '19427dbb-d315-45f4-95e1-34612eeef484', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94cd3aa9-8714-4558-a152-9028c9a56ec2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=636c11f2-6f40-406a-9e50-471026200858) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:24:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:24.329 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 636c11f2-6f40-406a-9e50-471026200858 in datapath 4eb1a9af-e25f-430e-90e4-b3eb0f848576 unbound from our chassis
Feb 19 19:24:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:24.330 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4eb1a9af-e25f-430e-90e4-b3eb0f848576, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:24:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:24.331 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[20041891-829a-4301-a0f6-90d0858a21d9]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:24.331 105986 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576 namespace which is not needed anymore
Feb 19 19:24:24 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Deactivated successfully.
Feb 19 19:24:24 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Consumed 8.776s CPU time.
Feb 19 19:24:24 compute-0 systemd-machined[156014]: Machine qemu-1-instance-00000003 terminated.
Feb 19 19:24:24 compute-0 neutron-haproxy-ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576[208676]: [NOTICE]   (208680) : haproxy version is 3.0.5-8e879a5
Feb 19 19:24:24 compute-0 neutron-haproxy-ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576[208676]: [NOTICE]   (208680) : path to executable is /usr/sbin/haproxy
Feb 19 19:24:24 compute-0 neutron-haproxy-ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576[208676]: [WARNING]  (208680) : Exiting Master process...
Feb 19 19:24:24 compute-0 podman[208757]: 2026-02-19 19:24:24.41605494 +0000 UTC m=+0.028511293 container kill b92d8901e78a19144b6e24edba5af1fb6d08e18354ba06cb31bda02088ee8252 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:24:24 compute-0 neutron-haproxy-ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576[208676]: [ALERT]    (208680) : Current worker (208682) exited with code 143 (Terminated)
Feb 19 19:24:24 compute-0 neutron-haproxy-ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576[208676]: [WARNING]  (208680) : All workers exited. Exiting... (0)
Feb 19 19:24:24 compute-0 systemd[1]: libpod-b92d8901e78a19144b6e24edba5af1fb6d08e18354ba06cb31bda02088ee8252.scope: Deactivated successfully.
Feb 19 19:24:24 compute-0 podman[208771]: 2026-02-19 19:24:24.447811081 +0000 UTC m=+0.021631969 container died b92d8901e78a19144b6e24edba5af1fb6d08e18354ba06cb31bda02088ee8252 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest)
Feb 19 19:24:24 compute-0 nova_compute[186662]: 2026-02-19 19:24:24.453 186666 DEBUG nova.compute.manager [req-26ad76ff-99c0-49c3-a7d1-871b539c4fd8 req-736ae2d7-77a3-4eec-a482-f08abdec07bf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Received event network-vif-unplugged-636c11f2-6f40-406a-9e50-471026200858 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:24:24 compute-0 nova_compute[186662]: 2026-02-19 19:24:24.454 186666 DEBUG oslo_concurrency.lockutils [req-26ad76ff-99c0-49c3-a7d1-871b539c4fd8 req-736ae2d7-77a3-4eec-a482-f08abdec07bf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "3819a62b-b779-4e11-b72f-0df58e3d54f9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:24:24 compute-0 nova_compute[186662]: 2026-02-19 19:24:24.454 186666 DEBUG oslo_concurrency.lockutils [req-26ad76ff-99c0-49c3-a7d1-871b539c4fd8 req-736ae2d7-77a3-4eec-a482-f08abdec07bf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "3819a62b-b779-4e11-b72f-0df58e3d54f9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:24:24 compute-0 nova_compute[186662]: 2026-02-19 19:24:24.454 186666 DEBUG oslo_concurrency.lockutils [req-26ad76ff-99c0-49c3-a7d1-871b539c4fd8 req-736ae2d7-77a3-4eec-a482-f08abdec07bf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "3819a62b-b779-4e11-b72f-0df58e3d54f9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:24:24 compute-0 nova_compute[186662]: 2026-02-19 19:24:24.455 186666 DEBUG nova.compute.manager [req-26ad76ff-99c0-49c3-a7d1-871b539c4fd8 req-736ae2d7-77a3-4eec-a482-f08abdec07bf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] No waiting events found dispatching network-vif-unplugged-636c11f2-6f40-406a-9e50-471026200858 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:24:24 compute-0 nova_compute[186662]: 2026-02-19 19:24:24.455 186666 DEBUG nova.compute.manager [req-26ad76ff-99c0-49c3-a7d1-871b539c4fd8 req-736ae2d7-77a3-4eec-a482-f08abdec07bf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Received event network-vif-unplugged-636c11f2-6f40-406a-9e50-471026200858 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:24:24 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b92d8901e78a19144b6e24edba5af1fb6d08e18354ba06cb31bda02088ee8252-userdata-shm.mount: Deactivated successfully.
Feb 19 19:24:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-97391cb91c17b868d1f99be92cb292681ce389c2974867e34764d72316165c09-merged.mount: Deactivated successfully.
Feb 19 19:24:24 compute-0 podman[208771]: 2026-02-19 19:24:24.479717534 +0000 UTC m=+0.053538412 container cleanup b92d8901e78a19144b6e24edba5af1fb6d08e18354ba06cb31bda02088ee8252 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Feb 19 19:24:24 compute-0 systemd[1]: libpod-conmon-b92d8901e78a19144b6e24edba5af1fb6d08e18354ba06cb31bda02088ee8252.scope: Deactivated successfully.
Feb 19 19:24:24 compute-0 podman[208778]: 2026-02-19 19:24:24.492346337 +0000 UTC m=+0.055101400 container remove b92d8901e78a19144b6e24edba5af1fb6d08e18354ba06cb31bda02088ee8252 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 19 19:24:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:24.495 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a915ad44-baf7-4b0b-b763-ed0ea977aabb]: (4, ("Thu Feb 19 07:24:24 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576 (b92d8901e78a19144b6e24edba5af1fb6d08e18354ba06cb31bda02088ee8252)\nb92d8901e78a19144b6e24edba5af1fb6d08e18354ba06cb31bda02088ee8252\nThu Feb 19 07:24:24 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576 (b92d8901e78a19144b6e24edba5af1fb6d08e18354ba06cb31bda02088ee8252)\nb92d8901e78a19144b6e24edba5af1fb6d08e18354ba06cb31bda02088ee8252\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:24.496 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[d34bf0bf-f565-47c0-a891-ab70cb0f046e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:24.497 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4eb1a9af-e25f-430e-90e4-b3eb0f848576.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4eb1a9af-e25f-430e-90e4-b3eb0f848576.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:24:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:24.498 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[5e03bdad-e602-407c-8f51-8e84bfbfb689]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:24.499 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4eb1a9af-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:24:24 compute-0 nova_compute[186662]: 2026-02-19 19:24:24.500 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:24 compute-0 nova_compute[186662]: 2026-02-19 19:24:24.504 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:24 compute-0 kernel: tap4eb1a9af-e0: left promiscuous mode
Feb 19 19:24:24 compute-0 nova_compute[186662]: 2026-02-19 19:24:24.508 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:24.510 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[f3742f0a-cc67-4b35-8445-c13e9e3b5bd8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:24.524 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[12e86c18-69af-49d1-8e1a-f6920b519bec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:24.524 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ec628410-9874-400e-95d2-f9dc64101b49]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:24 compute-0 nova_compute[186662]: 2026-02-19 19:24:24.526 186666 INFO nova.virt.libvirt.driver [-] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Instance destroyed successfully.
Feb 19 19:24:24 compute-0 nova_compute[186662]: 2026-02-19 19:24:24.527 186666 DEBUG nova.objects.instance [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lazy-loading 'resources' on Instance uuid 3819a62b-b779-4e11-b72f-0df58e3d54f9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:24:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:24.533 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[632e891e-50c3-4ef4-911d-7910b6e8e1de]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 332671, 'reachable_time': 22709, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208820, 'error': None, 'target': 'ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:24 compute-0 systemd[1]: run-netns-ovnmeta\x2d4eb1a9af\x2de25f\x2d430e\x2d90e4\x2db3eb0f848576.mount: Deactivated successfully.
Feb 19 19:24:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:24.536 106358 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4eb1a9af-e25f-430e-90e4-b3eb0f848576 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Feb 19 19:24:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:24.537 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[262721fb-3c3e-4a42-bd27-c00e760edf07]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.032 186666 DEBUG nova.virt.libvirt.vif [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:23:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-992543204',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-992543204',id=3,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:24:17Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0e5b0642e4554c3c9ab48a99c057c9f3',ramdisk_id='',reservation_id='r-i5sh6sdu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_pro
ject_name='tempest-TestDataModel-2087416303',owner_user_name='tempest-TestDataModel-2087416303-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:24:17Z,user_data=None,user_id='674fe20ddf9a44aeae2b9cdc9b63c3b9',uuid=3819a62b-b779-4e11-b72f-0df58e3d54f9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "636c11f2-6f40-406a-9e50-471026200858", "address": "fa:16:3e:ea:42:4c", "network": {"id": "4eb1a9af-e25f-430e-90e4-b3eb0f848576", "bridge": "br-int", "label": "tempest-TestDataModel-670056028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d22f3c65ef04151a6b3dc9a9c081278", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap636c11f2-6f", "ovs_interfaceid": "636c11f2-6f40-406a-9e50-471026200858", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.033 186666 DEBUG nova.network.os_vif_util [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Converting VIF {"id": "636c11f2-6f40-406a-9e50-471026200858", "address": "fa:16:3e:ea:42:4c", "network": {"id": "4eb1a9af-e25f-430e-90e4-b3eb0f848576", "bridge": "br-int", "label": "tempest-TestDataModel-670056028-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d22f3c65ef04151a6b3dc9a9c081278", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap636c11f2-6f", "ovs_interfaceid": "636c11f2-6f40-406a-9e50-471026200858", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.033 186666 DEBUG nova.network.os_vif_util [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:42:4c,bridge_name='br-int',has_traffic_filtering=True,id=636c11f2-6f40-406a-9e50-471026200858,network=Network(4eb1a9af-e25f-430e-90e4-b3eb0f848576),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap636c11f2-6f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.034 186666 DEBUG os_vif [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:42:4c,bridge_name='br-int',has_traffic_filtering=True,id=636c11f2-6f40-406a-9e50-471026200858,network=Network(4eb1a9af-e25f-430e-90e4-b3eb0f848576),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap636c11f2-6f') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.035 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.036 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap636c11f2-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.037 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.039 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.041 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.041 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=b2cdfa0e-cd87-4b61-bdc4-9b823534602a) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.043 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.044 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.046 186666 INFO os_vif [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:42:4c,bridge_name='br-int',has_traffic_filtering=True,id=636c11f2-6f40-406a-9e50-471026200858,network=Network(4eb1a9af-e25f-430e-90e4-b3eb0f848576),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap636c11f2-6f')
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.046 186666 INFO nova.virt.libvirt.driver [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Deleting instance files /var/lib/nova/instances/3819a62b-b779-4e11-b72f-0df58e3d54f9_del
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.047 186666 INFO nova.virt.libvirt.driver [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Deletion of /var/lib/nova/instances/3819a62b-b779-4e11-b72f-0df58e3d54f9_del complete
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.555 186666 INFO nova.compute.manager [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Took 1.26 seconds to destroy the instance on the hypervisor.
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.556 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.557 186666 DEBUG nova.compute.manager [-] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.557 186666 DEBUG nova.network.neutron [-] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.557 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:24:25 compute-0 podman[208825]: 2026-02-19 19:24:25.674770998 +0000 UTC m=+0.084332980 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 
Base Image, org.label-schema.schema-version=1.0)
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.685 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.964 186666 DEBUG nova.compute.manager [req-030f6444-9232-4a5c-9696-cc4ef8b0af7a req-7abaed0b-ffb7-4629-8438-f58399a4779e 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Received event network-vif-deleted-636c11f2-6f40-406a-9e50-471026200858 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.964 186666 INFO nova.compute.manager [req-030f6444-9232-4a5c-9696-cc4ef8b0af7a req-7abaed0b-ffb7-4629-8438-f58399a4779e 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Neutron deleted interface 636c11f2-6f40-406a-9e50-471026200858; detaching it from the instance and deleting it from the info cache
Feb 19 19:24:25 compute-0 nova_compute[186662]: 2026-02-19 19:24:25.965 186666 DEBUG nova.network.neutron [req-030f6444-9232-4a5c-9696-cc4ef8b0af7a req-7abaed0b-ffb7-4629-8438-f58399a4779e 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:24:26 compute-0 nova_compute[186662]: 2026-02-19 19:24:26.408 186666 DEBUG nova.network.neutron [-] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:24:26 compute-0 nova_compute[186662]: 2026-02-19 19:24:26.470 186666 DEBUG nova.compute.manager [req-030f6444-9232-4a5c-9696-cc4ef8b0af7a req-7abaed0b-ffb7-4629-8438-f58399a4779e 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Detach interface failed, port_id=636c11f2-6f40-406a-9e50-471026200858, reason: Instance 3819a62b-b779-4e11-b72f-0df58e3d54f9 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Feb 19 19:24:26 compute-0 nova_compute[186662]: 2026-02-19 19:24:26.511 186666 DEBUG nova.compute.manager [req-fad11263-d5e5-4167-b464-40ddcce4576b req-95f7d78c-46fe-4807-bf23-f67a548c67d1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Received event network-vif-unplugged-636c11f2-6f40-406a-9e50-471026200858 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:24:26 compute-0 nova_compute[186662]: 2026-02-19 19:24:26.512 186666 DEBUG oslo_concurrency.lockutils [req-fad11263-d5e5-4167-b464-40ddcce4576b req-95f7d78c-46fe-4807-bf23-f67a548c67d1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "3819a62b-b779-4e11-b72f-0df58e3d54f9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:24:26 compute-0 nova_compute[186662]: 2026-02-19 19:24:26.512 186666 DEBUG oslo_concurrency.lockutils [req-fad11263-d5e5-4167-b464-40ddcce4576b req-95f7d78c-46fe-4807-bf23-f67a548c67d1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "3819a62b-b779-4e11-b72f-0df58e3d54f9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:24:26 compute-0 nova_compute[186662]: 2026-02-19 19:24:26.512 186666 DEBUG oslo_concurrency.lockutils [req-fad11263-d5e5-4167-b464-40ddcce4576b req-95f7d78c-46fe-4807-bf23-f67a548c67d1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "3819a62b-b779-4e11-b72f-0df58e3d54f9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:24:26 compute-0 nova_compute[186662]: 2026-02-19 19:24:26.512 186666 DEBUG nova.compute.manager [req-fad11263-d5e5-4167-b464-40ddcce4576b req-95f7d78c-46fe-4807-bf23-f67a548c67d1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] No waiting events found dispatching network-vif-unplugged-636c11f2-6f40-406a-9e50-471026200858 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:24:26 compute-0 nova_compute[186662]: 2026-02-19 19:24:26.512 186666 DEBUG nova.compute.manager [req-fad11263-d5e5-4167-b464-40ddcce4576b req-95f7d78c-46fe-4807-bf23-f67a548c67d1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Received event network-vif-unplugged-636c11f2-6f40-406a-9e50-471026200858 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:24:26 compute-0 nova_compute[186662]: 2026-02-19 19:24:26.546 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:26 compute-0 nova_compute[186662]: 2026-02-19 19:24:26.913 186666 INFO nova.compute.manager [-] [instance: 3819a62b-b779-4e11-b72f-0df58e3d54f9] Took 1.36 seconds to deallocate network for instance.
Feb 19 19:24:27 compute-0 nova_compute[186662]: 2026-02-19 19:24:27.440 186666 DEBUG oslo_concurrency.lockutils [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:24:27 compute-0 nova_compute[186662]: 2026-02-19 19:24:27.440 186666 DEBUG oslo_concurrency.lockutils [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:24:27 compute-0 nova_compute[186662]: 2026-02-19 19:24:27.481 186666 DEBUG nova.compute.provider_tree [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:24:27 compute-0 nova_compute[186662]: 2026-02-19 19:24:27.993 186666 DEBUG nova.scheduler.client.report [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:24:28 compute-0 nova_compute[186662]: 2026-02-19 19:24:28.606 186666 DEBUG oslo_concurrency.lockutils [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.166s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:24:28 compute-0 nova_compute[186662]: 2026-02-19 19:24:28.721 186666 INFO nova.scheduler.client.report [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Deleted allocations for instance 3819a62b-b779-4e11-b72f-0df58e3d54f9
Feb 19 19:24:29 compute-0 podman[196025]: time="2026-02-19T19:24:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:24:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:24:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:24:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:24:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2160 "" "Go-http-client/1.1"
Feb 19 19:24:29 compute-0 nova_compute[186662]: 2026-02-19 19:24:29.926 186666 DEBUG oslo_concurrency.lockutils [None req-903cd02f-1641-47a4-99bf-9102725fa77f 674fe20ddf9a44aeae2b9cdc9b63c3b9 0e5b0642e4554c3c9ab48a99c057c9f3 - - default default] Lock "3819a62b-b779-4e11-b72f-0df58e3d54f9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.159s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:24:30 compute-0 nova_compute[186662]: 2026-02-19 19:24:30.043 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:31 compute-0 openstack_network_exporter[198916]: ERROR   19:24:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:24:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:24:31 compute-0 openstack_network_exporter[198916]: ERROR   19:24:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:24:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:24:31 compute-0 nova_compute[186662]: 2026-02-19 19:24:31.549 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:32.116 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:24:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:32.117 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:24:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:32.117 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:24:35 compute-0 nova_compute[186662]: 2026-02-19 19:24:35.045 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:35 compute-0 podman[208852]: 2026-02-19 19:24:35.289497601 +0000 UTC m=+0.067348964 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 19 19:24:36 compute-0 nova_compute[186662]: 2026-02-19 19:24:36.604 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:40 compute-0 nova_compute[186662]: 2026-02-19 19:24:40.049 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:41 compute-0 nova_compute[186662]: 2026-02-19 19:24:41.648 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:41 compute-0 nova_compute[186662]: 2026-02-19 19:24:41.715 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:45 compute-0 nova_compute[186662]: 2026-02-19 19:24:45.095 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:46 compute-0 nova_compute[186662]: 2026-02-19 19:24:46.651 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:50 compute-0 nova_compute[186662]: 2026-02-19 19:24:50.097 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:51 compute-0 nova_compute[186662]: 2026-02-19 19:24:51.684 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:52 compute-0 podman[208876]: 2026-02-19 19:24:52.264756204 +0000 UTC m=+0.040989662 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Feb 19 19:24:54 compute-0 podman[208896]: 2026-02-19 19:24:54.264575853 +0000 UTC m=+0.045113441 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-02-05T04:57:10Z, distribution-scope=public, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1770267347, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc.)
Feb 19 19:24:55 compute-0 nova_compute[186662]: 2026-02-19 19:24:55.099 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:55.441 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:f0:d6 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-23744514-9581-483b-ba8d-38106bcd89ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0de9f29fcea461c9f09c667b54fe8fe', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46bbe139-eec8-4de4-bfdc-b507967fb452, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=25b16785-ed71-4ba6-a91b-c4dcee5ff875) old=Port_Binding(mac=['fa:16:3e:46:f0:d6'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-23744514-9581-483b-ba8d-38106bcd89ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0de9f29fcea461c9f09c667b54fe8fe', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:24:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:55.442 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 25b16785-ed71-4ba6-a91b-c4dcee5ff875 in datapath 23744514-9581-483b-ba8d-38106bcd89ef updated
Feb 19 19:24:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:55.442 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 23744514-9581-483b-ba8d-38106bcd89ef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:24:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:24:55.443 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[d6b2054c-ee5d-4d10-a25d-1bb405e4c3a3]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:24:56 compute-0 podman[208918]: 2026-02-19 19:24:56.296777532 +0000 UTC m=+0.073611864 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Feb 19 19:24:56 compute-0 nova_compute[186662]: 2026-02-19 19:24:56.686 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:24:59 compute-0 podman[196025]: time="2026-02-19T19:24:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:24:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:24:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:24:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:24:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2163 "" "Go-http-client/1.1"
Feb 19 19:25:00 compute-0 nova_compute[186662]: 2026-02-19 19:25:00.100 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:01 compute-0 openstack_network_exporter[198916]: ERROR   19:25:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:25:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:25:01 compute-0 openstack_network_exporter[198916]: ERROR   19:25:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:25:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:25:01 compute-0 nova_compute[186662]: 2026-02-19 19:25:01.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:25:01 compute-0 nova_compute[186662]: 2026-02-19 19:25:01.576 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Feb 19 19:25:01 compute-0 nova_compute[186662]: 2026-02-19 19:25:01.687 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:01 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:01.899 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:86:f7 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-1277da57-36f0-44ae-b2d6-ec6f1bb23be0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1277da57-36f0-44ae-b2d6-ec6f1bb23be0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84d22a8926d9401eb98cf092c0899a62', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c5f24d0c-1378-497e-a2f2-47f8c79d6dfd, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=73306141-d8f4-48ff-adf1-267b0be9a6eb) old=Port_Binding(mac=['fa:16:3e:c3:86:f7'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-1277da57-36f0-44ae-b2d6-ec6f1bb23be0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1277da57-36f0-44ae-b2d6-ec6f1bb23be0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84d22a8926d9401eb98cf092c0899a62', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:25:01 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:01.900 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 73306141-d8f4-48ff-adf1-267b0be9a6eb in datapath 1277da57-36f0-44ae-b2d6-ec6f1bb23be0 updated
Feb 19 19:25:01 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:01.901 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1277da57-36f0-44ae-b2d6-ec6f1bb23be0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:25:01 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:01.902 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[d9f28484-9f59-41d3-86cd-bbd5d83f19f9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:25:02 compute-0 nova_compute[186662]: 2026-02-19 19:25:02.082 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Feb 19 19:25:02 compute-0 sshd-session[208943]: Invalid user httpd from 182.75.216.74 port 57651
Feb 19 19:25:02 compute-0 sshd-session[208943]: Received disconnect from 182.75.216.74 port 57651:11: Bye Bye [preauth]
Feb 19 19:25:02 compute-0 sshd-session[208943]: Disconnected from invalid user httpd 182.75.216.74 port 57651 [preauth]
Feb 19 19:25:03 compute-0 nova_compute[186662]: 2026-02-19 19:25:03.082 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:25:03 compute-0 nova_compute[186662]: 2026-02-19 19:25:03.082 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:25:03 compute-0 nova_compute[186662]: 2026-02-19 19:25:03.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:25:03 compute-0 nova_compute[186662]: 2026-02-19 19:25:03.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:25:03 compute-0 nova_compute[186662]: 2026-02-19 19:25:03.576 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Feb 19 19:25:04 compute-0 nova_compute[186662]: 2026-02-19 19:25:04.084 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:25:05 compute-0 nova_compute[186662]: 2026-02-19 19:25:05.102 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:06 compute-0 podman[208945]: 2026-02-19 19:25:06.286743577 +0000 UTC m=+0.053341758 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:25:06 compute-0 nova_compute[186662]: 2026-02-19 19:25:06.589 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:25:06 compute-0 nova_compute[186662]: 2026-02-19 19:25:06.690 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:07 compute-0 nova_compute[186662]: 2026-02-19 19:25:07.101 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:25:07 compute-0 nova_compute[186662]: 2026-02-19 19:25:07.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:25:08 compute-0 nova_compute[186662]: 2026-02-19 19:25:08.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:25:09 compute-0 nova_compute[186662]: 2026-02-19 19:25:09.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:25:10 compute-0 nova_compute[186662]: 2026-02-19 19:25:10.106 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:10 compute-0 nova_compute[186662]: 2026-02-19 19:25:10.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:25:11 compute-0 nova_compute[186662]: 2026-02-19 19:25:11.692 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:12 compute-0 nova_compute[186662]: 2026-02-19 19:25:12.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:25:13 compute-0 nova_compute[186662]: 2026-02-19 19:25:13.086 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:25:13 compute-0 nova_compute[186662]: 2026-02-19 19:25:13.086 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:25:13 compute-0 nova_compute[186662]: 2026-02-19 19:25:13.086 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:25:13 compute-0 nova_compute[186662]: 2026-02-19 19:25:13.086 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:25:13 compute-0 nova_compute[186662]: 2026-02-19 19:25:13.223 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:25:13 compute-0 nova_compute[186662]: 2026-02-19 19:25:13.225 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:25:13 compute-0 nova_compute[186662]: 2026-02-19 19:25:13.242 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:25:13 compute-0 nova_compute[186662]: 2026-02-19 19:25:13.243 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5863MB free_disk=72.97661590576172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:25:13 compute-0 nova_compute[186662]: 2026-02-19 19:25:13.243 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:25:13 compute-0 nova_compute[186662]: 2026-02-19 19:25:13.244 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:25:14 compute-0 nova_compute[186662]: 2026-02-19 19:25:14.289 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:25:14 compute-0 nova_compute[186662]: 2026-02-19 19:25:14.289 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:25:13 up 56 min,  0 user,  load average: 0.13, 0.18, 0.34\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:25:14 compute-0 nova_compute[186662]: 2026-02-19 19:25:14.311 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:25:14 compute-0 nova_compute[186662]: 2026-02-19 19:25:14.863 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:25:15 compute-0 nova_compute[186662]: 2026-02-19 19:25:15.108 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:15 compute-0 nova_compute[186662]: 2026-02-19 19:25:15.372 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:25:15 compute-0 nova_compute[186662]: 2026-02-19 19:25:15.372 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.129s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:25:15 compute-0 ovn_controller[96653]: 2026-02-19T19:25:15Z|00048|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Feb 19 19:25:16 compute-0 nova_compute[186662]: 2026-02-19 19:25:16.746 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:20 compute-0 nova_compute[186662]: 2026-02-19 19:25:20.110 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:21 compute-0 nova_compute[186662]: 2026-02-19 19:25:21.426 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:21.427 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:25:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:21.428 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:25:21 compute-0 nova_compute[186662]: 2026-02-19 19:25:21.776 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:23 compute-0 podman[208974]: 2026-02-19 19:25:23.264280693 +0000 UTC m=+0.041761212 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, io.buildah.version=1.43.0)
Feb 19 19:25:25 compute-0 nova_compute[186662]: 2026-02-19 19:25:25.112 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:25 compute-0 podman[208995]: 2026-02-19 19:25:25.263291136 +0000 UTC m=+0.044686162 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9/ubi-minimal, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, version=9.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, distribution-scope=public, io.buildah.version=1.33.7)
Feb 19 19:25:26 compute-0 nova_compute[186662]: 2026-02-19 19:25:26.815 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:27 compute-0 podman[209017]: 2026-02-19 19:25:27.324591218 +0000 UTC m=+0.095395875 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20260216)
Feb 19 19:25:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:28.429 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:25:29 compute-0 podman[196025]: time="2026-02-19T19:25:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:25:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:25:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:25:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:25:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2150 "" "Go-http-client/1.1"
Feb 19 19:25:30 compute-0 nova_compute[186662]: 2026-02-19 19:25:30.145 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:31 compute-0 openstack_network_exporter[198916]: ERROR   19:25:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:25:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:25:31 compute-0 openstack_network_exporter[198916]: ERROR   19:25:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:25:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:25:31 compute-0 nova_compute[186662]: 2026-02-19 19:25:31.817 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:32.118 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:25:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:32.118 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:25:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:32.118 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:25:35 compute-0 nova_compute[186662]: 2026-02-19 19:25:35.147 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:35 compute-0 nova_compute[186662]: 2026-02-19 19:25:35.299 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "72cc675e-4d5d-48c5-8c12-f9a42e168294" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:25:35 compute-0 nova_compute[186662]: 2026-02-19 19:25:35.300 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "72cc675e-4d5d-48c5-8c12-f9a42e168294" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:25:35 compute-0 nova_compute[186662]: 2026-02-19 19:25:35.805 186666 DEBUG nova.compute.manager [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Feb 19 19:25:36 compute-0 nova_compute[186662]: 2026-02-19 19:25:36.351 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:25:36 compute-0 nova_compute[186662]: 2026-02-19 19:25:36.351 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:25:36 compute-0 nova_compute[186662]: 2026-02-19 19:25:36.357 186666 DEBUG nova.virt.hardware [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Feb 19 19:25:36 compute-0 nova_compute[186662]: 2026-02-19 19:25:36.357 186666 INFO nova.compute.claims [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Claim successful on node compute-0.ctlplane.example.com
Feb 19 19:25:36 compute-0 nova_compute[186662]: 2026-02-19 19:25:36.870 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:37 compute-0 podman[209044]: 2026-02-19 19:25:37.264493606 +0000 UTC m=+0.046190377 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 19 19:25:37 compute-0 nova_compute[186662]: 2026-02-19 19:25:37.407 186666 DEBUG nova.compute.provider_tree [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:25:37 compute-0 nova_compute[186662]: 2026-02-19 19:25:37.913 186666 DEBUG nova.scheduler.client.report [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:25:38 compute-0 nova_compute[186662]: 2026-02-19 19:25:38.422 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.071s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:25:38 compute-0 nova_compute[186662]: 2026-02-19 19:25:38.423 186666 DEBUG nova.compute.manager [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Feb 19 19:25:39 compute-0 nova_compute[186662]: 2026-02-19 19:25:39.008 186666 DEBUG nova.compute.manager [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Feb 19 19:25:39 compute-0 nova_compute[186662]: 2026-02-19 19:25:39.009 186666 DEBUG nova.network.neutron [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Feb 19 19:25:39 compute-0 nova_compute[186662]: 2026-02-19 19:25:39.009 186666 WARNING neutronclient.v2_0.client [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:25:39 compute-0 nova_compute[186662]: 2026-02-19 19:25:39.010 186666 WARNING neutronclient.v2_0.client [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:25:39 compute-0 nova_compute[186662]: 2026-02-19 19:25:39.516 186666 INFO nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 19 19:25:39 compute-0 nova_compute[186662]: 2026-02-19 19:25:39.762 186666 DEBUG nova.network.neutron [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Successfully created port: 52549f9d-4d31-4b47-a4ed-063a73c7fd04 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Feb 19 19:25:40 compute-0 nova_compute[186662]: 2026-02-19 19:25:40.022 186666 DEBUG nova.compute.manager [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Feb 19 19:25:40 compute-0 nova_compute[186662]: 2026-02-19 19:25:40.149 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:40 compute-0 nova_compute[186662]: 2026-02-19 19:25:40.382 186666 DEBUG nova.network.neutron [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Successfully updated port: 52549f9d-4d31-4b47-a4ed-063a73c7fd04 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Feb 19 19:25:40 compute-0 nova_compute[186662]: 2026-02-19 19:25:40.453 186666 DEBUG nova.compute.manager [req-74956594-5fe0-4c3b-bb37-2c79191364cb req-710ea44b-e1e5-4967-8a80-fed0aca3c728 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Received event network-changed-52549f9d-4d31-4b47-a4ed-063a73c7fd04 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:25:40 compute-0 nova_compute[186662]: 2026-02-19 19:25:40.454 186666 DEBUG nova.compute.manager [req-74956594-5fe0-4c3b-bb37-2c79191364cb req-710ea44b-e1e5-4967-8a80-fed0aca3c728 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Refreshing instance network info cache due to event network-changed-52549f9d-4d31-4b47-a4ed-063a73c7fd04. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Feb 19 19:25:40 compute-0 nova_compute[186662]: 2026-02-19 19:25:40.454 186666 DEBUG oslo_concurrency.lockutils [req-74956594-5fe0-4c3b-bb37-2c79191364cb req-710ea44b-e1e5-4967-8a80-fed0aca3c728 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-72cc675e-4d5d-48c5-8c12-f9a42e168294" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:25:40 compute-0 nova_compute[186662]: 2026-02-19 19:25:40.454 186666 DEBUG oslo_concurrency.lockutils [req-74956594-5fe0-4c3b-bb37-2c79191364cb req-710ea44b-e1e5-4967-8a80-fed0aca3c728 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-72cc675e-4d5d-48c5-8c12-f9a42e168294" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:25:40 compute-0 nova_compute[186662]: 2026-02-19 19:25:40.454 186666 DEBUG nova.network.neutron [req-74956594-5fe0-4c3b-bb37-2c79191364cb req-710ea44b-e1e5-4967-8a80-fed0aca3c728 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Refreshing network info cache for port 52549f9d-4d31-4b47-a4ed-063a73c7fd04 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Feb 19 19:25:40 compute-0 nova_compute[186662]: 2026-02-19 19:25:40.899 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "refresh_cache-72cc675e-4d5d-48c5-8c12-f9a42e168294" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:25:40 compute-0 nova_compute[186662]: 2026-02-19 19:25:40.958 186666 WARNING neutronclient.v2_0.client [req-74956594-5fe0-4c3b-bb37-2c79191364cb req-710ea44b-e1e5-4967-8a80-fed0aca3c728 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.036 186666 DEBUG nova.compute.manager [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.037 186666 DEBUG nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.038 186666 INFO nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Creating image(s)
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.039 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "/var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.039 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "/var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.040 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "/var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.040 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.043 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.044 186666 DEBUG nova.network.neutron [req-74956594-5fe0-4c3b-bb37-2c79191364cb req-710ea44b-e1e5-4967-8a80-fed0aca3c728 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.046 186666 DEBUG oslo_concurrency.processutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.087 186666 DEBUG oslo_concurrency.processutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.088 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.088 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.089 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.092 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.092 186666 DEBUG oslo_concurrency.processutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.130 186666 DEBUG oslo_concurrency.processutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.130 186666 DEBUG oslo_concurrency.processutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.157 186666 DEBUG oslo_concurrency.processutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk 1073741824" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.157 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.069s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.158 186666 DEBUG oslo_concurrency.processutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.197 186666 DEBUG oslo_concurrency.processutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.198 186666 DEBUG nova.virt.disk.api [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Checking if we can resize image /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.198 186666 DEBUG oslo_concurrency.processutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.217 186666 DEBUG nova.network.neutron [req-74956594-5fe0-4c3b-bb37-2c79191364cb req-710ea44b-e1e5-4967-8a80-fed0aca3c728 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.241 186666 DEBUG oslo_concurrency.processutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.242 186666 DEBUG nova.virt.disk.api [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Cannot resize image /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.242 186666 DEBUG nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.242 186666 DEBUG nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Ensure instance console log exists: /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.243 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.243 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.243 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.726 186666 DEBUG oslo_concurrency.lockutils [req-74956594-5fe0-4c3b-bb37-2c79191364cb req-710ea44b-e1e5-4967-8a80-fed0aca3c728 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-72cc675e-4d5d-48c5-8c12-f9a42e168294" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.727 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquired lock "refresh_cache-72cc675e-4d5d-48c5-8c12-f9a42e168294" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.727 186666 DEBUG nova.network.neutron [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:25:41 compute-0 nova_compute[186662]: 2026-02-19 19:25:41.901 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:42 compute-0 nova_compute[186662]: 2026-02-19 19:25:42.311 186666 DEBUG nova.network.neutron [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:25:42 compute-0 nova_compute[186662]: 2026-02-19 19:25:42.488 186666 WARNING neutronclient.v2_0.client [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:25:42 compute-0 nova_compute[186662]: 2026-02-19 19:25:42.646 186666 DEBUG nova.network.neutron [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Updating instance_info_cache with network_info: [{"id": "52549f9d-4d31-4b47-a4ed-063a73c7fd04", "address": "fa:16:3e:f4:c3:31", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52549f9d-4d", "ovs_interfaceid": "52549f9d-4d31-4b47-a4ed-063a73c7fd04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.153 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Releasing lock "refresh_cache-72cc675e-4d5d-48c5-8c12-f9a42e168294" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.154 186666 DEBUG nova.compute.manager [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Instance network_info: |[{"id": "52549f9d-4d31-4b47-a4ed-063a73c7fd04", "address": "fa:16:3e:f4:c3:31", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52549f9d-4d", "ovs_interfaceid": "52549f9d-4d31-4b47-a4ed-063a73c7fd04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.156 186666 DEBUG nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Start _get_guest_xml network_info=[{"id": "52549f9d-4d31-4b47-a4ed-063a73c7fd04", "address": "fa:16:3e:f4:c3:31", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52549f9d-4d", "ovs_interfaceid": "52549f9d-4d31-4b47-a4ed-063a73c7fd04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'image_id': 'b8007ea6-afa7-4c5a-abc0-d9d7338ce087'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.160 186666 WARNING nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.161 186666 DEBUG nova.virt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1820161035', uuid='72cc675e-4d5d-48c5-8c12-f9a42e168294'), owner=OwnerMeta(userid='af924faf672a45b4b8708466af6eeb12', username='tempest-TestExecuteActionsViaActuator-1567565925-project-admin', projectid='84d22a8926d9401eb98cf092c0899a62', projectname='tempest-TestExecuteActionsViaActuator-1567565925'), image=ImageMeta(id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "52549f9d-4d31-4b47-a4ed-063a73c7fd04", "address": "fa:16:3e:f4:c3:31", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52549f9d-4d", "ovs_interfaceid": 
"52549f9d-4d31-4b47-a4ed-063a73c7fd04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1771529143.1613376) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.165 186666 DEBUG nova.virt.libvirt.host [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.166 186666 DEBUG nova.virt.libvirt.host [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.169 186666 DEBUG nova.virt.libvirt.host [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.169 186666 DEBUG nova.virt.libvirt.host [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.170 186666 DEBUG nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.171 186666 DEBUG nova.virt.hardware [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-19T19:19:33Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.171 186666 DEBUG nova.virt.hardware [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.172 186666 DEBUG nova.virt.hardware [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.172 186666 DEBUG nova.virt.hardware [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.172 186666 DEBUG nova.virt.hardware [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.172 186666 DEBUG nova.virt.hardware [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.172 186666 DEBUG nova.virt.hardware [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.173 186666 DEBUG nova.virt.hardware [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.173 186666 DEBUG nova.virt.hardware [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.173 186666 DEBUG nova.virt.hardware [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.173 186666 DEBUG nova.virt.hardware [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.177 186666 DEBUG nova.virt.libvirt.vif [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:25:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1820161035',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1820161035',id=5,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84d22a8926d9401eb98cf092c0899a62',ramdisk_id='',reservation_id='r-x5x8d73p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1567565925',owner_user_name='tempest-TestExecuteActionsViaActuator-1567565925-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:25:40Z,user_data=None,user_id='af924faf672a45b4b8708466af6eeb12',uuid=72cc675e-4d5d-48c5-8c12-f9a42e168294,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "52549f9d-4d31-4b47-a4ed-063a73c7fd04", "address": "fa:16:3e:f4:c3:31", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52549f9d-4d", "ovs_interfaceid": "52549f9d-4d31-4b47-a4ed-063a73c7fd04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.178 186666 DEBUG nova.network.os_vif_util [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converting VIF {"id": "52549f9d-4d31-4b47-a4ed-063a73c7fd04", "address": "fa:16:3e:f4:c3:31", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52549f9d-4d", "ovs_interfaceid": "52549f9d-4d31-4b47-a4ed-063a73c7fd04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.178 186666 DEBUG nova.network.os_vif_util [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:c3:31,bridge_name='br-int',has_traffic_filtering=True,id=52549f9d-4d31-4b47-a4ed-063a73c7fd04,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52549f9d-4d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.179 186666 DEBUG nova.objects.instance [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lazy-loading 'pci_devices' on Instance uuid 72cc675e-4d5d-48c5-8c12-f9a42e168294 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.686 186666 DEBUG nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] End _get_guest_xml xml=<domain type="kvm">
Feb 19 19:25:43 compute-0 nova_compute[186662]:   <uuid>72cc675e-4d5d-48c5-8c12-f9a42e168294</uuid>
Feb 19 19:25:43 compute-0 nova_compute[186662]:   <name>instance-00000005</name>
Feb 19 19:25:43 compute-0 nova_compute[186662]:   <memory>131072</memory>
Feb 19 19:25:43 compute-0 nova_compute[186662]:   <vcpu>1</vcpu>
Feb 19 19:25:43 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1820161035</nova:name>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:25:43</nova:creationTime>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:25:43 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:25:43 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:25:43 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:25:43 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:25:43 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:25:43 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:25:43 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:25:43 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:25:43 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:25:43 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:25:43 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:25:43 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:25:43 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:25:43 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:25:43 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:25:43 compute-0 nova_compute[186662]:         <nova:user uuid="af924faf672a45b4b8708466af6eeb12">tempest-TestExecuteActionsViaActuator-1567565925-project-admin</nova:user>
Feb 19 19:25:43 compute-0 nova_compute[186662]:         <nova:project uuid="84d22a8926d9401eb98cf092c0899a62">tempest-TestExecuteActionsViaActuator-1567565925</nova:project>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:25:43 compute-0 nova_compute[186662]:         <nova:port uuid="52549f9d-4d31-4b47-a4ed-063a73c7fd04">
Feb 19 19:25:43 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:25:43 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:25:43 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <system>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <entry name="serial">72cc675e-4d5d-48c5-8c12-f9a42e168294</entry>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <entry name="uuid">72cc675e-4d5d-48c5-8c12-f9a42e168294</entry>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     </system>
Feb 19 19:25:43 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:25:43 compute-0 nova_compute[186662]:   <os>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:   </os>
Feb 19 19:25:43 compute-0 nova_compute[186662]:   <features>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <vmcoreinfo/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:   </features>
Feb 19 19:25:43 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:25:43 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact">
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <model>Nehalem</model>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <topology sockets="1" cores="1" threads="1"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:25:43 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk.config"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <interface type="ethernet">
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <mac address="fa:16:3e:f4:c3:31"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <mtu size="1442"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <target dev="tap52549f9d-4d"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <serial type="pty">
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/console.log" append="off"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <video>
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     </video>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <controller type="usb" index="0"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:25:43 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:25:43 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:25:43 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:25:43 compute-0 nova_compute[186662]: </domain>
Feb 19 19:25:43 compute-0 nova_compute[186662]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.688 186666 DEBUG nova.compute.manager [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Preparing to wait for external event network-vif-plugged-52549f9d-4d31-4b47-a4ed-063a73c7fd04 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.688 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "72cc675e-4d5d-48c5-8c12-f9a42e168294-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.688 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "72cc675e-4d5d-48c5-8c12-f9a42e168294-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.688 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "72cc675e-4d5d-48c5-8c12-f9a42e168294-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.689 186666 DEBUG nova.virt.libvirt.vif [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:25:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1820161035',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1820161035',id=5,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84d22a8926d9401eb98cf092c0899a62',ramdisk_id='',reservation_id='r-x5x8d73p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1567565925',owner_user_name='tempest-TestExecuteActionsViaActuator-1567565925-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:25:40Z,user_data=None,user_id='af924faf672a45b4b8708466af6eeb12',uuid=72cc675e-4d5d-48c5-8c12-f9a42e168294,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "52549f9d-4d31-4b47-a4ed-063a73c7fd04", "address": "fa:16:3e:f4:c3:31", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52549f9d-4d", "ovs_interfaceid": "52549f9d-4d31-4b47-a4ed-063a73c7fd04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.689 186666 DEBUG nova.network.os_vif_util [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converting VIF {"id": "52549f9d-4d31-4b47-a4ed-063a73c7fd04", "address": "fa:16:3e:f4:c3:31", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52549f9d-4d", "ovs_interfaceid": "52549f9d-4d31-4b47-a4ed-063a73c7fd04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.690 186666 DEBUG nova.network.os_vif_util [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:c3:31,bridge_name='br-int',has_traffic_filtering=True,id=52549f9d-4d31-4b47-a4ed-063a73c7fd04,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52549f9d-4d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.690 186666 DEBUG os_vif [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:c3:31,bridge_name='br-int',has_traffic_filtering=True,id=52549f9d-4d31-4b47-a4ed-063a73c7fd04,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52549f9d-4d') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.691 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.691 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.692 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.693 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.693 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'e899f667-9a70-5e63-a749-98e4624e1ca2', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.694 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.696 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.696 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.698 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.698 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap52549f9d-4d, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.699 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap52549f9d-4d, col_values=(('qos', UUID('ed894571-31e8-44fc-b324-b9e039599c75')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.699 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap52549f9d-4d, col_values=(('external_ids', {'iface-id': '52549f9d-4d31-4b47-a4ed-063a73c7fd04', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:c3:31', 'vm-uuid': '72cc675e-4d5d-48c5-8c12-f9a42e168294'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.700 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:43 compute-0 NetworkManager[56519]: <info>  [1771529143.7013] manager: (tap52549f9d-4d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.702 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.705 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:43 compute-0 nova_compute[186662]: 2026-02-19 19:25:43.706 186666 INFO os_vif [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:c3:31,bridge_name='br-int',has_traffic_filtering=True,id=52549f9d-4d31-4b47-a4ed-063a73c7fd04,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52549f9d-4d')
Feb 19 19:25:45 compute-0 nova_compute[186662]: 2026-02-19 19:25:45.247 186666 DEBUG nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:25:45 compute-0 nova_compute[186662]: 2026-02-19 19:25:45.248 186666 DEBUG nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:25:45 compute-0 nova_compute[186662]: 2026-02-19 19:25:45.249 186666 DEBUG nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] No VIF found with MAC fa:16:3e:f4:c3:31, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Feb 19 19:25:45 compute-0 nova_compute[186662]: 2026-02-19 19:25:45.249 186666 INFO nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Using config drive
Feb 19 19:25:45 compute-0 nova_compute[186662]: 2026-02-19 19:25:45.759 186666 WARNING neutronclient.v2_0.client [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:25:45 compute-0 nova_compute[186662]: 2026-02-19 19:25:45.886 186666 INFO nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Creating config drive at /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk.config
Feb 19 19:25:45 compute-0 nova_compute[186662]: 2026-02-19 19:25:45.890 186666 DEBUG oslo_concurrency.processutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpa4a_b3g2 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.008 186666 DEBUG oslo_concurrency.processutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpa4a_b3g2" returned: 0 in 0.119s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:25:46 compute-0 kernel: tap52549f9d-4d: entered promiscuous mode
Feb 19 19:25:46 compute-0 NetworkManager[56519]: <info>  [1771529146.0477] manager: (tap52549f9d-4d): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Feb 19 19:25:46 compute-0 ovn_controller[96653]: 2026-02-19T19:25:46Z|00049|binding|INFO|Claiming lport 52549f9d-4d31-4b47-a4ed-063a73c7fd04 for this chassis.
Feb 19 19:25:46 compute-0 ovn_controller[96653]: 2026-02-19T19:25:46Z|00050|binding|INFO|52549f9d-4d31-4b47-a4ed-063a73c7fd04: Claiming fa:16:3e:f4:c3:31 10.100.0.8
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.049 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.051 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.054 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.061 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:c3:31 10.100.0.8'], port_security=['fa:16:3e:f4:c3:31 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '72cc675e-4d5d-48c5-8c12-f9a42e168294', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-23744514-9581-483b-ba8d-38106bcd89ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84d22a8926d9401eb98cf092c0899a62', 'neutron:revision_number': '4', 'neutron:security_group_ids': '286c3adc-3464-487a-ae0e-8231188cef8f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46bbe139-eec8-4de4-bfdc-b507967fb452, chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=52549f9d-4d31-4b47-a4ed-063a73c7fd04) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.062 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 52549f9d-4d31-4b47-a4ed-063a73c7fd04 in datapath 23744514-9581-483b-ba8d-38106bcd89ef bound to our chassis
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.063 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 23744514-9581-483b-ba8d-38106bcd89ef
Feb 19 19:25:46 compute-0 systemd-machined[156014]: New machine qemu-2-instance-00000005.
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.069 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[6dd613d9-eb70-44fd-96d3-be19771ec83c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.070 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap23744514-91 in ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.072 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.073 207540 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap23744514-90 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.073 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[80965244-1e4f-4a7a-a722-24d541cdfa55]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:25:46 compute-0 ovn_controller[96653]: 2026-02-19T19:25:46Z|00051|binding|INFO|Setting lport 52549f9d-4d31-4b47-a4ed-063a73c7fd04 ovn-installed in OVS
Feb 19 19:25:46 compute-0 ovn_controller[96653]: 2026-02-19T19:25:46Z|00052|binding|INFO|Setting lport 52549f9d-4d31-4b47-a4ed-063a73c7fd04 up in Southbound
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.074 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[304145e8-8aa3-4ac8-9aeb-ce5ab4d52ce5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.076 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:46 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000005.
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.080 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[87a55bdd-539a-4d72-97cb-b263257a72e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.086 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[f312cdc8-9a7c-49ec-b921-e81b9a5fb8f1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:25:46 compute-0 systemd-udevd[209106]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:25:46 compute-0 NetworkManager[56519]: <info>  [1771529146.0959] device (tap52549f9d-4d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:25:46 compute-0 NetworkManager[56519]: <info>  [1771529146.0965] device (tap52549f9d-4d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.102 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[6288bc4a-6946-4d09-8e06-f87cf652cf08]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.105 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[5e5df787-e518-4189-88b2-75655220e625]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:25:46 compute-0 NetworkManager[56519]: <info>  [1771529146.1064] manager: (tap23744514-90): new Veth device (/org/freedesktop/NetworkManager/Devices/26)
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.126 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[a2f8557b-e3db-4c30-b50e-176288e7bb7d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.129 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[64411998-5b08-4962-8874-7c4a88e681cd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:25:46 compute-0 NetworkManager[56519]: <info>  [1771529146.1407] device (tap23744514-90): carrier: link connected
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.144 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[90131145-3b37-408b-bc21-5cb1b5e03059]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.153 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[73ff77bb-5a56-48f8-91de-76732018bae1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap23744514-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:f0:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341570, 'reachable_time': 39245, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209136, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.162 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[abdb54c5-64eb-4fe9-ab25-99358a1c3edf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe46:f0d6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341570, 'tstamp': 341570}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209137, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.171 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[9b8ae0cb-89df-40b8-80cb-0c9f27e6b21d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap23744514-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:f0:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341570, 'reachable_time': 39245, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 209138, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.188 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[bee0fd87-e078-42b2-9239-1f595a942d32]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.198 186666 DEBUG nova.compute.manager [req-eda603e4-fc79-4212-a4a9-23dd56fe526c req-717f587f-acf7-4242-a61f-60920420572f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Received event network-vif-plugged-52549f9d-4d31-4b47-a4ed-063a73c7fd04 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.199 186666 DEBUG oslo_concurrency.lockutils [req-eda603e4-fc79-4212-a4a9-23dd56fe526c req-717f587f-acf7-4242-a61f-60920420572f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "72cc675e-4d5d-48c5-8c12-f9a42e168294-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.199 186666 DEBUG oslo_concurrency.lockutils [req-eda603e4-fc79-4212-a4a9-23dd56fe526c req-717f587f-acf7-4242-a61f-60920420572f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "72cc675e-4d5d-48c5-8c12-f9a42e168294-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.199 186666 DEBUG oslo_concurrency.lockutils [req-eda603e4-fc79-4212-a4a9-23dd56fe526c req-717f587f-acf7-4242-a61f-60920420572f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "72cc675e-4d5d-48c5-8c12-f9a42e168294-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.199 186666 DEBUG nova.compute.manager [req-eda603e4-fc79-4212-a4a9-23dd56fe526c req-717f587f-acf7-4242-a61f-60920420572f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Processing event network-vif-plugged-52549f9d-4d31-4b47-a4ed-063a73c7fd04 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.224 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[d62e6d2a-81b1-47fc-a5ac-8364023d52d2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.225 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23744514-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.225 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.225 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23744514-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:25:46 compute-0 NetworkManager[56519]: <info>  [1771529146.2275] manager: (tap23744514-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Feb 19 19:25:46 compute-0 kernel: tap23744514-90: entered promiscuous mode
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.227 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.229 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap23744514-90, col_values=(('external_ids', {'iface-id': '25b16785-ed71-4ba6-a91b-c4dcee5ff875'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:25:46 compute-0 ovn_controller[96653]: 2026-02-19T19:25:46Z|00053|binding|INFO|Releasing lport 25b16785-ed71-4ba6-a91b-c4dcee5ff875 from this chassis (sb_readonly=0)
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.235 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.235 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[1ef64abf-b5cb-4626-8419-7a4a46c9eb43]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.236 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.236 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.236 105986 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 23744514-9581-483b-ba8d-38106bcd89ef disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.236 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.237 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[5ee02e8b-cfe7-4016-bd12-8d9698f3982d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.237 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.237 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[7eeb1d50-49d8-484f-93ba-f635e54cdf40]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.238 105986 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: global
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     log         /dev/log local0 debug
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     log-tag     haproxy-metadata-proxy-23744514-9581-483b-ba8d-38106bcd89ef
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     user        root
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     group       root
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     maxconn     1024
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     pidfile     /var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     daemon
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: defaults
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     log global
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     mode http
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     option httplog
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     option dontlognull
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     option http-server-close
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     option forwardfor
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     retries                 3
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     timeout http-request    30s
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     timeout connect         30s
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     timeout client          32s
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     timeout server          32s
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     timeout http-keep-alive 30s
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: listen listener
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     bind 169.254.169.254:80
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     server metadata /var/lib/neutron/metadata_proxy
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:     http-request add-header X-OVN-Network-ID 23744514-9581-483b-ba8d-38106bcd89ef
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Feb 19 19:25:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:25:46.240 105986 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'env', 'PROCESS_TAG=haproxy-23744514-9581-483b-ba8d-38106bcd89ef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/23744514-9581-483b-ba8d-38106bcd89ef.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.360 186666 DEBUG nova.compute.manager [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.363 186666 DEBUG nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.366 186666 INFO nova.virt.libvirt.driver [-] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Instance spawned successfully.
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.366 186666 DEBUG nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Feb 19 19:25:46 compute-0 podman[209177]: 2026-02-19 19:25:46.548561582 +0000 UTC m=+0.038292667 container create 8e547e8b1119dc1450f6dbd8ede72055b580def76a26047751a64a49519983b0 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS)
Feb 19 19:25:46 compute-0 systemd[1]: Started libpod-conmon-8e547e8b1119dc1450f6dbd8ede72055b580def76a26047751a64a49519983b0.scope.
Feb 19 19:25:46 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:25:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4ab769aa4d319dcce232a1f0754f124605a3b2871b4ffe0e89eaf1baa5022e3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 19 19:25:46 compute-0 podman[209177]: 2026-02-19 19:25:46.620835443 +0000 UTC m=+0.110566618 container init 8e547e8b1119dc1450f6dbd8ede72055b580def76a26047751a64a49519983b0 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 19 19:25:46 compute-0 podman[209177]: 2026-02-19 19:25:46.52758564 +0000 UTC m=+0.017316745 image pull dfc4ee2c621ea595c16a7cd7a8233293e80df6b99346b1ce1a7ed22a35aace27 38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Feb 19 19:25:46 compute-0 podman[209177]: 2026-02-19 19:25:46.624761407 +0000 UTC m=+0.114492492 container start 8e547e8b1119dc1450f6dbd8ede72055b580def76a26047751a64a49519983b0 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:25:46 compute-0 neutron-haproxy-ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef[209192]: [NOTICE]   (209196) : New worker (209198) forked
Feb 19 19:25:46 compute-0 neutron-haproxy-ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef[209192]: [NOTICE]   (209196) : Loading success.
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.876 186666 DEBUG nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.877 186666 DEBUG nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.877 186666 DEBUG nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.877 186666 DEBUG nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.878 186666 DEBUG nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.878 186666 DEBUG nova.virt.libvirt.driver [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:25:46 compute-0 nova_compute[186662]: 2026-02-19 19:25:46.913 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:47 compute-0 nova_compute[186662]: 2026-02-19 19:25:47.387 186666 INFO nova.compute.manager [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Took 6.35 seconds to spawn the instance on the hypervisor.
Feb 19 19:25:47 compute-0 nova_compute[186662]: 2026-02-19 19:25:47.388 186666 DEBUG nova.compute.manager [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:25:47 compute-0 nova_compute[186662]: 2026-02-19 19:25:47.916 186666 INFO nova.compute.manager [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Took 11.60 seconds to build instance.
Feb 19 19:25:48 compute-0 nova_compute[186662]: 2026-02-19 19:25:48.275 186666 DEBUG nova.compute.manager [req-2eceaf6d-e72b-4778-a750-f4c0d34acb24 req-036a8741-6b69-40eb-968b-16555b4ff1ad 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Received event network-vif-plugged-52549f9d-4d31-4b47-a4ed-063a73c7fd04 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:25:48 compute-0 nova_compute[186662]: 2026-02-19 19:25:48.276 186666 DEBUG oslo_concurrency.lockutils [req-2eceaf6d-e72b-4778-a750-f4c0d34acb24 req-036a8741-6b69-40eb-968b-16555b4ff1ad 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "72cc675e-4d5d-48c5-8c12-f9a42e168294-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:25:48 compute-0 nova_compute[186662]: 2026-02-19 19:25:48.276 186666 DEBUG oslo_concurrency.lockutils [req-2eceaf6d-e72b-4778-a750-f4c0d34acb24 req-036a8741-6b69-40eb-968b-16555b4ff1ad 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "72cc675e-4d5d-48c5-8c12-f9a42e168294-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:25:48 compute-0 nova_compute[186662]: 2026-02-19 19:25:48.276 186666 DEBUG oslo_concurrency.lockutils [req-2eceaf6d-e72b-4778-a750-f4c0d34acb24 req-036a8741-6b69-40eb-968b-16555b4ff1ad 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "72cc675e-4d5d-48c5-8c12-f9a42e168294-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:25:48 compute-0 nova_compute[186662]: 2026-02-19 19:25:48.277 186666 DEBUG nova.compute.manager [req-2eceaf6d-e72b-4778-a750-f4c0d34acb24 req-036a8741-6b69-40eb-968b-16555b4ff1ad 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] No waiting events found dispatching network-vif-plugged-52549f9d-4d31-4b47-a4ed-063a73c7fd04 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:25:48 compute-0 nova_compute[186662]: 2026-02-19 19:25:48.277 186666 WARNING nova.compute.manager [req-2eceaf6d-e72b-4778-a750-f4c0d34acb24 req-036a8741-6b69-40eb-968b-16555b4ff1ad 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Received unexpected event network-vif-plugged-52549f9d-4d31-4b47-a4ed-063a73c7fd04 for instance with vm_state active and task_state None.
Feb 19 19:25:48 compute-0 nova_compute[186662]: 2026-02-19 19:25:48.421 186666 DEBUG oslo_concurrency.lockutils [None req-14010860-998f-477c-9718-bbd47c85267c af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "72cc675e-4d5d-48c5-8c12-f9a42e168294" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.122s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:25:48 compute-0 nova_compute[186662]: 2026-02-19 19:25:48.701 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:51 compute-0 nova_compute[186662]: 2026-02-19 19:25:51.938 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:53 compute-0 nova_compute[186662]: 2026-02-19 19:25:53.704 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:54 compute-0 podman[209207]: 2026-02-19 19:25:54.264418481 +0000 UTC m=+0.043300018 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest)
Feb 19 19:25:56 compute-0 podman[209225]: 2026-02-19 19:25:56.268340569 +0000 UTC m=+0.050145811 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, version=9.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 19 19:25:56 compute-0 nova_compute[186662]: 2026-02-19 19:25:56.979 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:58 compute-0 podman[209261]: 2026-02-19 19:25:58.317345733 +0000 UTC m=+0.096588654 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:25:58 compute-0 nova_compute[186662]: 2026-02-19 19:25:58.705 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:25:59 compute-0 ovn_controller[96653]: 2026-02-19T19:25:59Z|00003|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f4:c3:31 10.100.0.8
Feb 19 19:25:59 compute-0 ovn_controller[96653]: 2026-02-19T19:25:59Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f4:c3:31 10.100.0.8
Feb 19 19:25:59 compute-0 podman[196025]: time="2026-02-19T19:25:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:25:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:25:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17220 "" "Go-http-client/1.1"
Feb 19 19:25:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:25:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2622 "" "Go-http-client/1.1"
Feb 19 19:26:00 compute-0 nova_compute[186662]: 2026-02-19 19:26:00.113 186666 DEBUG nova.compute.manager [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Stashing vm_state: active _prep_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:6173
Feb 19 19:26:00 compute-0 nova_compute[186662]: 2026-02-19 19:26:00.643 186666 DEBUG oslo_concurrency.lockutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:26:00 compute-0 nova_compute[186662]: 2026-02-19 19:26:00.643 186666 DEBUG oslo_concurrency.lockutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:26:01 compute-0 nova_compute[186662]: 2026-02-19 19:26:01.157 186666 DEBUG nova.virt.hardware [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Feb 19 19:26:01 compute-0 nova_compute[186662]: 2026-02-19 19:26:01.158 186666 INFO nova.compute.claims [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Claim successful on node compute-0.ctlplane.example.com
Feb 19 19:26:01 compute-0 openstack_network_exporter[198916]: ERROR   19:26:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:26:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:26:01 compute-0 openstack_network_exporter[198916]: ERROR   19:26:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:26:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:26:01 compute-0 nova_compute[186662]: 2026-02-19 19:26:01.667 186666 INFO nova.compute.resource_tracker [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Updating resource usage from migration 94cd3078-8823-48fe-b500-a06fe34e58fd
Feb 19 19:26:01 compute-0 nova_compute[186662]: 2026-02-19 19:26:01.668 186666 DEBUG nova.compute.resource_tracker [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Starting to track incoming migration 94cd3078-8823-48fe-b500-a06fe34e58fd with flavor be514118-0e97-4b07-bca3-35af980bcd98 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Feb 19 19:26:02 compute-0 nova_compute[186662]: 2026-02-19 19:26:02.036 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:02 compute-0 nova_compute[186662]: 2026-02-19 19:26:02.211 186666 DEBUG nova.compute.provider_tree [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:26:02 compute-0 nova_compute[186662]: 2026-02-19 19:26:02.717 186666 DEBUG nova.scheduler.client.report [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:26:03 compute-0 nova_compute[186662]: 2026-02-19 19:26:03.225 186666 DEBUG oslo_concurrency.lockutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 2.582s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:26:03 compute-0 nova_compute[186662]: 2026-02-19 19:26:03.225 186666 INFO nova.compute.manager [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Migrating
Feb 19 19:26:03 compute-0 nova_compute[186662]: 2026-02-19 19:26:03.226 186666 DEBUG oslo_concurrency.lockutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:26:03 compute-0 nova_compute[186662]: 2026-02-19 19:26:03.226 186666 DEBUG oslo_concurrency.lockutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:26:03 compute-0 nova_compute[186662]: 2026-02-19 19:26:03.707 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:03 compute-0 nova_compute[186662]: 2026-02-19 19:26:03.735 186666 INFO nova.compute.rpcapi [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Automatically selected compute RPC version 6.4 from minimum service version 70
Feb 19 19:26:03 compute-0 nova_compute[186662]: 2026-02-19 19:26:03.736 186666 DEBUG oslo_concurrency.lockutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:26:05 compute-0 nova_compute[186662]: 2026-02-19 19:26:05.300 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:26:05 compute-0 nova_compute[186662]: 2026-02-19 19:26:05.300 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:26:05 compute-0 nova_compute[186662]: 2026-02-19 19:26:05.809 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Triggering sync for uuid 72cc675e-4d5d-48c5-8c12-f9a42e168294 _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11024
Feb 19 19:26:05 compute-0 nova_compute[186662]: 2026-02-19 19:26:05.810 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "72cc675e-4d5d-48c5-8c12-f9a42e168294" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:26:05 compute-0 nova_compute[186662]: 2026-02-19 19:26:05.811 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "72cc675e-4d5d-48c5-8c12-f9a42e168294" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:26:05 compute-0 nova_compute[186662]: 2026-02-19 19:26:05.811 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:26:05 compute-0 nova_compute[186662]: 2026-02-19 19:26:05.812 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:26:06 compute-0 nova_compute[186662]: 2026-02-19 19:26:06.087 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:26:06 compute-0 nova_compute[186662]: 2026-02-19 19:26:06.321 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "72cc675e-4d5d-48c5-8c12-f9a42e168294" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.510s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:26:07 compute-0 nova_compute[186662]: 2026-02-19 19:26:07.084 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:07 compute-0 nova_compute[186662]: 2026-02-19 19:26:07.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:26:07 compute-0 sshd-session[209287]: Accepted publickey for nova from 192.168.122.101 port 35244 ssh2: ECDSA SHA256:liNwzzNsv7m7uKbs0CZWtaLfd+C4L5fwY2+6uBHzm7c
Feb 19 19:26:07 compute-0 systemd-logind[822]: New session 28 of user nova.
Feb 19 19:26:07 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 19 19:26:07 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 19 19:26:07 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 19 19:26:07 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 19 19:26:07 compute-0 systemd[209302]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 19 19:26:07 compute-0 podman[209289]: 2026-02-19 19:26:07.957167232 +0000 UTC m=+0.063971210 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 19 19:26:08 compute-0 systemd[209302]: Queued start job for default target Main User Target.
Feb 19 19:26:08 compute-0 systemd[209302]: Created slice User Application Slice.
Feb 19 19:26:08 compute-0 systemd[209302]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 19 19:26:08 compute-0 systemd[209302]: Started Daily Cleanup of User's Temporary Directories.
Feb 19 19:26:08 compute-0 rsyslogd[1019]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 19 19:26:08 compute-0 systemd[209302]: Reached target Paths.
Feb 19 19:26:08 compute-0 rsyslogd[1019]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 19 19:26:08 compute-0 systemd[209302]: Reached target Timers.
Feb 19 19:26:08 compute-0 systemd[209302]: Starting D-Bus User Message Bus Socket...
Feb 19 19:26:08 compute-0 systemd[209302]: Starting Create User's Volatile Files and Directories...
Feb 19 19:26:08 compute-0 systemd[209302]: Finished Create User's Volatile Files and Directories.
Feb 19 19:26:08 compute-0 systemd[209302]: Listening on D-Bus User Message Bus Socket.
Feb 19 19:26:08 compute-0 systemd[209302]: Reached target Sockets.
Feb 19 19:26:08 compute-0 systemd[209302]: Reached target Basic System.
Feb 19 19:26:08 compute-0 systemd[209302]: Reached target Main User Target.
Feb 19 19:26:08 compute-0 systemd[209302]: Startup finished in 116ms.
Feb 19 19:26:08 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 19 19:26:08 compute-0 systemd[1]: Started Session 28 of User nova.
Feb 19 19:26:08 compute-0 sshd-session[209287]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 19 19:26:08 compute-0 sshd-session[209330]: Received disconnect from 192.168.122.101 port 35244:11: disconnected by user
Feb 19 19:26:08 compute-0 sshd-session[209330]: Disconnected from user nova 192.168.122.101 port 35244
Feb 19 19:26:08 compute-0 sshd-session[209287]: pam_unix(sshd:session): session closed for user nova
Feb 19 19:26:08 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Feb 19 19:26:08 compute-0 systemd-logind[822]: Session 28 logged out. Waiting for processes to exit.
Feb 19 19:26:08 compute-0 systemd-logind[822]: Removed session 28.
Feb 19 19:26:08 compute-0 sshd-session[209332]: Accepted publickey for nova from 192.168.122.101 port 35260 ssh2: ECDSA SHA256:liNwzzNsv7m7uKbs0CZWtaLfd+C4L5fwY2+6uBHzm7c
Feb 19 19:26:08 compute-0 systemd-logind[822]: New session 30 of user nova.
Feb 19 19:26:08 compute-0 systemd[1]: Started Session 30 of User nova.
Feb 19 19:26:08 compute-0 sshd-session[209332]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 19 19:26:08 compute-0 sshd-session[209335]: Received disconnect from 192.168.122.101 port 35260:11: disconnected by user
Feb 19 19:26:08 compute-0 sshd-session[209335]: Disconnected from user nova 192.168.122.101 port 35260
Feb 19 19:26:08 compute-0 sshd-session[209332]: pam_unix(sshd:session): session closed for user nova
Feb 19 19:26:08 compute-0 systemd[1]: session-30.scope: Deactivated successfully.
Feb 19 19:26:08 compute-0 systemd-logind[822]: Session 30 logged out. Waiting for processes to exit.
Feb 19 19:26:08 compute-0 systemd-logind[822]: Removed session 30.
Feb 19 19:26:08 compute-0 nova_compute[186662]: 2026-02-19 19:26:08.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:26:08 compute-0 nova_compute[186662]: 2026-02-19 19:26:08.710 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:10 compute-0 nova_compute[186662]: 2026-02-19 19:26:10.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:26:10 compute-0 nova_compute[186662]: 2026-02-19 19:26:10.743 186666 DEBUG nova.compute.manager [req-de681587-09e7-4938-b797-0d8cfb5f9c65 req-05c88b52-10f8-4bf6-9b55-61056c3cec87 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Received event network-vif-unplugged-e43829dc-7578-4c37-87d3-5cbc96a2767f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:26:10 compute-0 nova_compute[186662]: 2026-02-19 19:26:10.744 186666 DEBUG oslo_concurrency.lockutils [req-de681587-09e7-4938-b797-0d8cfb5f9c65 req-05c88b52-10f8-4bf6-9b55-61056c3cec87 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:26:10 compute-0 nova_compute[186662]: 2026-02-19 19:26:10.744 186666 DEBUG oslo_concurrency.lockutils [req-de681587-09e7-4938-b797-0d8cfb5f9c65 req-05c88b52-10f8-4bf6-9b55-61056c3cec87 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:26:10 compute-0 nova_compute[186662]: 2026-02-19 19:26:10.745 186666 DEBUG oslo_concurrency.lockutils [req-de681587-09e7-4938-b797-0d8cfb5f9c65 req-05c88b52-10f8-4bf6-9b55-61056c3cec87 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:26:10 compute-0 nova_compute[186662]: 2026-02-19 19:26:10.745 186666 DEBUG nova.compute.manager [req-de681587-09e7-4938-b797-0d8cfb5f9c65 req-05c88b52-10f8-4bf6-9b55-61056c3cec87 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] No waiting events found dispatching network-vif-unplugged-e43829dc-7578-4c37-87d3-5cbc96a2767f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:26:10 compute-0 nova_compute[186662]: 2026-02-19 19:26:10.745 186666 WARNING nova.compute.manager [req-de681587-09e7-4938-b797-0d8cfb5f9c65 req-05c88b52-10f8-4bf6-9b55-61056c3cec87 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Received unexpected event network-vif-unplugged-e43829dc-7578-4c37-87d3-5cbc96a2767f for instance with vm_state active and task_state resize_migrating.
Feb 19 19:26:11 compute-0 nova_compute[186662]: 2026-02-19 19:26:11.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:26:11 compute-0 sshd-session[209337]: Accepted publickey for nova from 192.168.122.101 port 35276 ssh2: ECDSA SHA256:liNwzzNsv7m7uKbs0CZWtaLfd+C4L5fwY2+6uBHzm7c
Feb 19 19:26:11 compute-0 systemd-logind[822]: New session 31 of user nova.
Feb 19 19:26:11 compute-0 systemd[1]: Started Session 31 of User nova.
Feb 19 19:26:11 compute-0 sshd-session[209337]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 19 19:26:11 compute-0 sshd-session[209340]: Received disconnect from 192.168.122.101 port 35276:11: disconnected by user
Feb 19 19:26:11 compute-0 sshd-session[209340]: Disconnected from user nova 192.168.122.101 port 35276
Feb 19 19:26:12 compute-0 sshd-session[209337]: pam_unix(sshd:session): session closed for user nova
Feb 19 19:26:12 compute-0 systemd[1]: session-31.scope: Deactivated successfully.
Feb 19 19:26:12 compute-0 systemd-logind[822]: Session 31 logged out. Waiting for processes to exit.
Feb 19 19:26:12 compute-0 systemd-logind[822]: Removed session 31.
Feb 19 19:26:12 compute-0 sshd-session[209342]: Accepted publickey for nova from 192.168.122.101 port 35278 ssh2: ECDSA SHA256:liNwzzNsv7m7uKbs0CZWtaLfd+C4L5fwY2+6uBHzm7c
Feb 19 19:26:12 compute-0 nova_compute[186662]: 2026-02-19 19:26:12.127 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:12 compute-0 systemd-logind[822]: New session 32 of user nova.
Feb 19 19:26:12 compute-0 systemd[1]: Started Session 32 of User nova.
Feb 19 19:26:12 compute-0 sshd-session[209342]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 19 19:26:12 compute-0 sshd-session[209345]: Received disconnect from 192.168.122.101 port 35278:11: disconnected by user
Feb 19 19:26:12 compute-0 sshd-session[209345]: Disconnected from user nova 192.168.122.101 port 35278
Feb 19 19:26:12 compute-0 sshd-session[209342]: pam_unix(sshd:session): session closed for user nova
Feb 19 19:26:12 compute-0 systemd[1]: session-32.scope: Deactivated successfully.
Feb 19 19:26:12 compute-0 systemd-logind[822]: Session 32 logged out. Waiting for processes to exit.
Feb 19 19:26:12 compute-0 systemd-logind[822]: Removed session 32.
Feb 19 19:26:12 compute-0 sshd-session[209347]: Accepted publickey for nova from 192.168.122.101 port 32894 ssh2: ECDSA SHA256:liNwzzNsv7m7uKbs0CZWtaLfd+C4L5fwY2+6uBHzm7c
Feb 19 19:26:12 compute-0 systemd-logind[822]: New session 33 of user nova.
Feb 19 19:26:12 compute-0 systemd[1]: Started Session 33 of User nova.
Feb 19 19:26:12 compute-0 sshd-session[209347]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 19 19:26:12 compute-0 sshd-session[209350]: Received disconnect from 192.168.122.101 port 32894:11: disconnected by user
Feb 19 19:26:12 compute-0 sshd-session[209350]: Disconnected from user nova 192.168.122.101 port 32894
Feb 19 19:26:12 compute-0 sshd-session[209347]: pam_unix(sshd:session): session closed for user nova
Feb 19 19:26:12 compute-0 systemd[1]: session-33.scope: Deactivated successfully.
Feb 19 19:26:12 compute-0 systemd-logind[822]: Session 33 logged out. Waiting for processes to exit.
Feb 19 19:26:12 compute-0 systemd-logind[822]: Removed session 33.
Feb 19 19:26:12 compute-0 nova_compute[186662]: 2026-02-19 19:26:12.840 186666 DEBUG nova.compute.manager [req-3b05199c-522b-4d4b-96fb-29784141bc4e req-56097edf-39bf-4a08-a2ed-9b827f50bacc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Received event network-vif-unplugged-e43829dc-7578-4c37-87d3-5cbc96a2767f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:26:12 compute-0 nova_compute[186662]: 2026-02-19 19:26:12.841 186666 DEBUG oslo_concurrency.lockutils [req-3b05199c-522b-4d4b-96fb-29784141bc4e req-56097edf-39bf-4a08-a2ed-9b827f50bacc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:26:12 compute-0 nova_compute[186662]: 2026-02-19 19:26:12.841 186666 DEBUG oslo_concurrency.lockutils [req-3b05199c-522b-4d4b-96fb-29784141bc4e req-56097edf-39bf-4a08-a2ed-9b827f50bacc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:26:12 compute-0 nova_compute[186662]: 2026-02-19 19:26:12.841 186666 DEBUG oslo_concurrency.lockutils [req-3b05199c-522b-4d4b-96fb-29784141bc4e req-56097edf-39bf-4a08-a2ed-9b827f50bacc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:26:12 compute-0 nova_compute[186662]: 2026-02-19 19:26:12.842 186666 DEBUG nova.compute.manager [req-3b05199c-522b-4d4b-96fb-29784141bc4e req-56097edf-39bf-4a08-a2ed-9b827f50bacc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] No waiting events found dispatching network-vif-unplugged-e43829dc-7578-4c37-87d3-5cbc96a2767f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:26:12 compute-0 nova_compute[186662]: 2026-02-19 19:26:12.842 186666 WARNING nova.compute.manager [req-3b05199c-522b-4d4b-96fb-29784141bc4e req-56097edf-39bf-4a08-a2ed-9b827f50bacc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Received unexpected event network-vif-unplugged-e43829dc-7578-4c37-87d3-5cbc96a2767f for instance with vm_state active and task_state resize_migrating.
Feb 19 19:26:13 compute-0 nova_compute[186662]: 2026-02-19 19:26:13.713 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:14 compute-0 nova_compute[186662]: 2026-02-19 19:26:14.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:26:15 compute-0 nova_compute[186662]: 2026-02-19 19:26:15.088 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:26:15 compute-0 nova_compute[186662]: 2026-02-19 19:26:15.088 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:26:15 compute-0 nova_compute[186662]: 2026-02-19 19:26:15.089 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:26:15 compute-0 nova_compute[186662]: 2026-02-19 19:26:15.089 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:26:15 compute-0 nova_compute[186662]: 2026-02-19 19:26:15.114 186666 WARNING neutronclient.v2_0.client [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:26:15 compute-0 nova_compute[186662]: 2026-02-19 19:26:15.254 186666 INFO nova.network.neutron [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Updating port e43829dc-7578-4c37-87d3-5cbc96a2767f with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Feb 19 19:26:15 compute-0 nova_compute[186662]: 2026-02-19 19:26:15.790 186666 DEBUG oslo_concurrency.lockutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-df77e346-76b1-4b06-8611-44d3ac9fc3ef" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:26:15 compute-0 nova_compute[186662]: 2026-02-19 19:26:15.791 186666 DEBUG oslo_concurrency.lockutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-df77e346-76b1-4b06-8611-44d3ac9fc3ef" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:26:15 compute-0 nova_compute[186662]: 2026-02-19 19:26:15.791 186666 DEBUG nova.network.neutron [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:26:15 compute-0 nova_compute[186662]: 2026-02-19 19:26:15.849 186666 DEBUG nova.compute.manager [req-ef1bf0de-5e75-4bf9-9216-71dc2bf4fac3 req-c0976e35-6f83-4320-8f31-f8765f45f11b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Received event network-changed-e43829dc-7578-4c37-87d3-5cbc96a2767f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:26:15 compute-0 nova_compute[186662]: 2026-02-19 19:26:15.849 186666 DEBUG nova.compute.manager [req-ef1bf0de-5e75-4bf9-9216-71dc2bf4fac3 req-c0976e35-6f83-4320-8f31-f8765f45f11b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Refreshing instance network info cache due to event network-changed-e43829dc-7578-4c37-87d3-5cbc96a2767f. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Feb 19 19:26:15 compute-0 nova_compute[186662]: 2026-02-19 19:26:15.849 186666 DEBUG oslo_concurrency.lockutils [req-ef1bf0de-5e75-4bf9-9216-71dc2bf4fac3 req-c0976e35-6f83-4320-8f31-f8765f45f11b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-df77e346-76b1-4b06-8611-44d3ac9fc3ef" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:26:16 compute-0 nova_compute[186662]: 2026-02-19 19:26:16.132 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:26:16 compute-0 nova_compute[186662]: 2026-02-19 19:26:16.173 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:26:16 compute-0 nova_compute[186662]: 2026-02-19 19:26:16.175 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:26:16 compute-0 nova_compute[186662]: 2026-02-19 19:26:16.216 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:26:16 compute-0 nova_compute[186662]: 2026-02-19 19:26:16.297 186666 WARNING neutronclient.v2_0.client [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:26:16 compute-0 nova_compute[186662]: 2026-02-19 19:26:16.366 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:26:16 compute-0 nova_compute[186662]: 2026-02-19 19:26:16.367 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:26:16 compute-0 nova_compute[186662]: 2026-02-19 19:26:16.384 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:26:16 compute-0 nova_compute[186662]: 2026-02-19 19:26:16.385 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5619MB free_disk=72.91902160644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:26:16 compute-0 nova_compute[186662]: 2026-02-19 19:26:16.385 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:26:16 compute-0 nova_compute[186662]: 2026-02-19 19:26:16.386 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:26:17 compute-0 nova_compute[186662]: 2026-02-19 19:26:17.169 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:17 compute-0 nova_compute[186662]: 2026-02-19 19:26:17.404 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Applying migration context for instance df77e346-76b1-4b06-8611-44d3ac9fc3ef as it has an incoming, in-progress migration 94cd3078-8823-48fe-b500-a06fe34e58fd. Migration status is post-migrating _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1046
Feb 19 19:26:17 compute-0 nova_compute[186662]: 2026-02-19 19:26:17.406 186666 INFO nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Updating resource usage from migration 94cd3078-8823-48fe-b500-a06fe34e58fd
Feb 19 19:26:17 compute-0 nova_compute[186662]: 2026-02-19 19:26:17.467 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance 72cc675e-4d5d-48c5-8c12-f9a42e168294 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:26:17 compute-0 nova_compute[186662]: 2026-02-19 19:26:17.468 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance df77e346-76b1-4b06-8611-44d3ac9fc3ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:26:17 compute-0 nova_compute[186662]: 2026-02-19 19:26:17.468 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:26:17 compute-0 nova_compute[186662]: 2026-02-19 19:26:17.469 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:26:16 up 57 min,  0 user,  load average: 0.46, 0.27, 0.36\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_resize_migrated': '1', 'num_os_type_None': '2', 'num_proj_84d22a8926d9401eb98cf092c0899a62': '2', 'io_workload': '0', 'num_task_None': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:26:17 compute-0 nova_compute[186662]: 2026-02-19 19:26:17.575 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:26:18 compute-0 nova_compute[186662]: 2026-02-19 19:26:18.081 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:26:18 compute-0 nova_compute[186662]: 2026-02-19 19:26:18.259 186666 WARNING neutronclient.v2_0.client [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:26:18 compute-0 nova_compute[186662]: 2026-02-19 19:26:18.431 186666 DEBUG nova.network.neutron [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Updating instance_info_cache with network_info: [{"id": "e43829dc-7578-4c37-87d3-5cbc96a2767f", "address": "fa:16:3e:0c:fd:c8", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape43829dc-75", "ovs_interfaceid": "e43829dc-7578-4c37-87d3-5cbc96a2767f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:26:18 compute-0 nova_compute[186662]: 2026-02-19 19:26:18.654 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:26:18 compute-0 nova_compute[186662]: 2026-02-19 19:26:18.654 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.268s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:26:18 compute-0 nova_compute[186662]: 2026-02-19 19:26:18.715 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:18 compute-0 nova_compute[186662]: 2026-02-19 19:26:18.966 186666 DEBUG oslo_concurrency.lockutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-df77e346-76b1-4b06-8611-44d3ac9fc3ef" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:26:18 compute-0 nova_compute[186662]: 2026-02-19 19:26:18.970 186666 DEBUG oslo_concurrency.lockutils [req-ef1bf0de-5e75-4bf9-9216-71dc2bf4fac3 req-c0976e35-6f83-4320-8f31-f8765f45f11b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-df77e346-76b1-4b06-8611-44d3ac9fc3ef" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:26:18 compute-0 nova_compute[186662]: 2026-02-19 19:26:18.971 186666 DEBUG nova.network.neutron [req-ef1bf0de-5e75-4bf9-9216-71dc2bf4fac3 req-c0976e35-6f83-4320-8f31-f8765f45f11b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Refreshing network info cache for port e43829dc-7578-4c37-87d3-5cbc96a2767f _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Feb 19 19:26:19 compute-0 nova_compute[186662]: 2026-02-19 19:26:19.481 186666 WARNING neutronclient.v2_0.client [req-ef1bf0de-5e75-4bf9-9216-71dc2bf4fac3 req-c0976e35-6f83-4320-8f31-f8765f45f11b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:26:19 compute-0 nova_compute[186662]: 2026-02-19 19:26:19.523 186666 DEBUG nova.virt.libvirt.driver [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Starting finish_migration finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12604
Feb 19 19:26:19 compute-0 nova_compute[186662]: 2026-02-19 19:26:19.526 186666 DEBUG nova.virt.libvirt.driver [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Instance directory exists: not creating _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5134
Feb 19 19:26:19 compute-0 nova_compute[186662]: 2026-02-19 19:26:19.526 186666 INFO nova.virt.libvirt.driver [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Creating image(s)
Feb 19 19:26:19 compute-0 nova_compute[186662]: 2026-02-19 19:26:19.527 186666 DEBUG oslo_concurrency.processutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:26:19 compute-0 nova_compute[186662]: 2026-02-19 19:26:19.585 186666 DEBUG oslo_concurrency.processutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:26:19 compute-0 nova_compute[186662]: 2026-02-19 19:26:19.586 186666 DEBUG nova.virt.disk.api [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Checking if we can resize image /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:26:19 compute-0 nova_compute[186662]: 2026-02-19 19:26:19.586 186666 DEBUG oslo_concurrency.processutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:26:19 compute-0 nova_compute[186662]: 2026-02-19 19:26:19.628 186666 DEBUG oslo_concurrency.processutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:26:19 compute-0 nova_compute[186662]: 2026-02-19 19:26:19.628 186666 DEBUG nova.virt.disk.api [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Cannot resize image /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.134 186666 DEBUG nova.virt.libvirt.driver [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Did not create local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5272
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.134 186666 DEBUG nova.virt.libvirt.driver [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Ensure instance console log exists: /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.134 186666 DEBUG oslo_concurrency.lockutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.135 186666 DEBUG oslo_concurrency.lockutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.135 186666 DEBUG oslo_concurrency.lockutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.137 186666 DEBUG nova.virt.libvirt.driver [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Start _get_guest_xml network_info=[{"id": "e43829dc-7578-4c37-87d3-5cbc96a2767f", "address": "fa:16:3e:0c:fd:c8", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "vif_mac": "fa:16:3e:0c:fd:c8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape43829dc-75", "ovs_interfaceid": "e43829dc-7578-4c37-87d3-5cbc96a2767f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'image_id': 'b8007ea6-afa7-4c5a-abc0-d9d7338ce087'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.140 186666 WARNING nova.virt.libvirt.driver [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.141 186666 DEBUG nova.virt.driver [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1723250421', uuid='df77e346-76b1-4b06-8611-44d3ac9fc3ef'), owner=OwnerMeta(userid='af924faf672a45b4b8708466af6eeb12', username='tempest-TestExecuteActionsViaActuator-1567565925-project-admin', projectid='84d22a8926d9401eb98cf092c0899a62', projectname='tempest-TestExecuteActionsViaActuator-1567565925'), image=ImageMeta(id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_cdrom_bus': 'sata', 'hw_disk_bus': 'virtio', 'hw_input_bus': 'usb', 'hw_machine_type': 'q35', 'hw_pointer_model': 'usbtablet', 'hw_rng_model': 'virtio', 'hw_video_model': 'virtio', 'hw_vif_model': 'virtio'}), flavor=FlavorMeta(name='m1.micro', flavorid='be514118-0e97-4b07-bca3-35af980bcd98', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "e43829dc-7578-4c37-87d3-5cbc96a2767f", "address": "fa:16:3e:0c:fd:c8", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "vif_mac": "fa:16:3e:0c:fd:c8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape43829dc-75", "ovs_interfaceid": "e43829dc-7578-4c37-87d3-5cbc96a2767f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1771529180.1412826) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.144 186666 DEBUG nova.virt.libvirt.host [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.145 186666 DEBUG nova.virt.libvirt.host [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.147 186666 DEBUG nova.virt.libvirt.host [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.147 186666 DEBUG nova.virt.libvirt.host [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.148 186666 DEBUG nova.virt.libvirt.driver [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.148 186666 DEBUG nova.virt.hardware [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-19T19:19:34Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='be514118-0e97-4b07-bca3-35af980bcd98',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.149 186666 DEBUG nova.virt.hardware [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.149 186666 DEBUG nova.virt.hardware [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.149 186666 DEBUG nova.virt.hardware [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.149 186666 DEBUG nova.virt.hardware [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.149 186666 DEBUG nova.virt.hardware [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.149 186666 DEBUG nova.virt.hardware [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.150 186666 DEBUG nova.virt.hardware [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.150 186666 DEBUG nova.virt.hardware [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.150 186666 DEBUG nova.virt.hardware [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.150 186666 DEBUG nova.virt.hardware [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.153 186666 DEBUG oslo_concurrency.processutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk.config --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.221 186666 DEBUG oslo_concurrency.processutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk.config --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.222 186666 DEBUG oslo_concurrency.lockutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "/var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.222 186666 DEBUG oslo_concurrency.lockutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "/var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.223 186666 DEBUG oslo_concurrency.lockutils [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "/var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.225 186666 DEBUG nova.virt.libvirt.vif [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:25:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1723250421',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1723250421',id=4,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:25:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='84d22a8926d9401eb98cf092c0899a62',ramdisk_id='',reservation_id='r-23170twp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_vide
o_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1567565925',owner_user_name='tempest-TestExecuteActionsViaActuator-1567565925-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:26:13Z,user_data=None,user_id='af924faf672a45b4b8708466af6eeb12',uuid=df77e346-76b1-4b06-8611-44d3ac9fc3ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e43829dc-7578-4c37-87d3-5cbc96a2767f", "address": "fa:16:3e:0c:fd:c8", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "vif_mac": "fa:16:3e:0c:fd:c8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape43829dc-75", "ovs_interfaceid": "e43829dc-7578-4c37-87d3-5cbc96a2767f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.225 186666 DEBUG nova.network.os_vif_util [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "e43829dc-7578-4c37-87d3-5cbc96a2767f", "address": "fa:16:3e:0c:fd:c8", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "vif_mac": "fa:16:3e:0c:fd:c8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape43829dc-75", "ovs_interfaceid": "e43829dc-7578-4c37-87d3-5cbc96a2767f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.226 186666 DEBUG nova.network.os_vif_util [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:fd:c8,bridge_name='br-int',has_traffic_filtering=True,id=e43829dc-7578-4c37-87d3-5cbc96a2767f,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape43829dc-75') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.229 186666 DEBUG nova.virt.libvirt.driver [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] End _get_guest_xml xml=<domain type="kvm">
Feb 19 19:26:20 compute-0 nova_compute[186662]:   <uuid>df77e346-76b1-4b06-8611-44d3ac9fc3ef</uuid>
Feb 19 19:26:20 compute-0 nova_compute[186662]:   <name>instance-00000004</name>
Feb 19 19:26:20 compute-0 nova_compute[186662]:   <memory>196608</memory>
Feb 19 19:26:20 compute-0 nova_compute[186662]:   <vcpu>1</vcpu>
Feb 19 19:26:20 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1723250421</nova:name>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:26:20</nova:creationTime>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <nova:flavor name="m1.micro" id="be514118-0e97-4b07-bca3-35af980bcd98">
Feb 19 19:26:20 compute-0 nova_compute[186662]:         <nova:memory>192</nova:memory>
Feb 19 19:26:20 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:26:20 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:26:20 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:26:20 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:26:20 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:26:20 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:26:20 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:26:20 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:26:20 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:26:20 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:26:20 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:26:20 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:26:20 compute-0 nova_compute[186662]:           <nova:property name="hw_cdrom_bus">sata</nova:property>
Feb 19 19:26:20 compute-0 nova_compute[186662]:           <nova:property name="hw_disk_bus">virtio</nova:property>
Feb 19 19:26:20 compute-0 nova_compute[186662]:           <nova:property name="hw_input_bus">usb</nova:property>
Feb 19 19:26:20 compute-0 nova_compute[186662]:           <nova:property name="hw_machine_type">q35</nova:property>
Feb 19 19:26:20 compute-0 nova_compute[186662]:           <nova:property name="hw_pointer_model">usbtablet</nova:property>
Feb 19 19:26:20 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:26:20 compute-0 nova_compute[186662]:           <nova:property name="hw_video_model">virtio</nova:property>
Feb 19 19:26:20 compute-0 nova_compute[186662]:           <nova:property name="hw_vif_model">virtio</nova:property>
Feb 19 19:26:20 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:26:20 compute-0 nova_compute[186662]:         <nova:user uuid="af924faf672a45b4b8708466af6eeb12">tempest-TestExecuteActionsViaActuator-1567565925-project-admin</nova:user>
Feb 19 19:26:20 compute-0 nova_compute[186662]:         <nova:project uuid="84d22a8926d9401eb98cf092c0899a62">tempest-TestExecuteActionsViaActuator-1567565925</nova:project>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:26:20 compute-0 nova_compute[186662]:         <nova:port uuid="e43829dc-7578-4c37-87d3-5cbc96a2767f">
Feb 19 19:26:20 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:26:20 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:26:20 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <system>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <entry name="serial">df77e346-76b1-4b06-8611-44d3ac9fc3ef</entry>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <entry name="uuid">df77e346-76b1-4b06-8611-44d3ac9fc3ef</entry>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     </system>
Feb 19 19:26:20 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:26:20 compute-0 nova_compute[186662]:   <os>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:   </os>
Feb 19 19:26:20 compute-0 nova_compute[186662]:   <features>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <vmcoreinfo/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:   </features>
Feb 19 19:26:20 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:26:20 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact">
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <model>Nehalem</model>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <topology sockets="1" cores="1" threads="1"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:26:20 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk.config"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <interface type="ethernet">
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <mac address="fa:16:3e:0c:fd:c8"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <mtu size="1442"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <target dev="tape43829dc-75"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <serial type="pty">
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/console.log" append="off"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <video>
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     </video>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <controller type="usb" index="0"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:26:20 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:26:20 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:26:20 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:26:20 compute-0 nova_compute[186662]: </domain>
Feb 19 19:26:20 compute-0 nova_compute[186662]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.230 186666 DEBUG nova.virt.libvirt.vif [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:25:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1723250421',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1723250421',id=4,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:25:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='84d22a8926d9401eb98cf092c0899a62',ramdisk_id='',reservation_id='r-23170twp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_vide
o_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1567565925',owner_user_name='tempest-TestExecuteActionsViaActuator-1567565925-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:26:13Z,user_data=None,user_id='af924faf672a45b4b8708466af6eeb12',uuid=df77e346-76b1-4b06-8611-44d3ac9fc3ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e43829dc-7578-4c37-87d3-5cbc96a2767f", "address": "fa:16:3e:0c:fd:c8", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "vif_mac": "fa:16:3e:0c:fd:c8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape43829dc-75", "ovs_interfaceid": "e43829dc-7578-4c37-87d3-5cbc96a2767f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.231 186666 DEBUG nova.network.os_vif_util [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "e43829dc-7578-4c37-87d3-5cbc96a2767f", "address": "fa:16:3e:0c:fd:c8", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "vif_mac": "fa:16:3e:0c:fd:c8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape43829dc-75", "ovs_interfaceid": "e43829dc-7578-4c37-87d3-5cbc96a2767f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.231 186666 DEBUG nova.network.os_vif_util [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:fd:c8,bridge_name='br-int',has_traffic_filtering=True,id=e43829dc-7578-4c37-87d3-5cbc96a2767f,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape43829dc-75') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.232 186666 DEBUG os_vif [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:fd:c8,bridge_name='br-int',has_traffic_filtering=True,id=e43829dc-7578-4c37-87d3-5cbc96a2767f,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape43829dc-75') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.232 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.233 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.233 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.234 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.234 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'b3494646-5f0b-57f9-ab31-76af0b1abe9c', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.235 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.236 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.239 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.240 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape43829dc-75, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.240 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tape43829dc-75, col_values=(('qos', UUID('e2ec8905-8a44-4544-9969-bc83ef4619a6')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.241 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tape43829dc-75, col_values=(('external_ids', {'iface-id': 'e43829dc-7578-4c37-87d3-5cbc96a2767f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0c:fd:c8', 'vm-uuid': 'df77e346-76b1-4b06-8611-44d3ac9fc3ef'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:26:20 compute-0 NetworkManager[56519]: <info>  [1771529180.2430] manager: (tape43829dc-75): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.244 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.248 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.248 186666 INFO os_vif [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:fd:c8,bridge_name='br-int',has_traffic_filtering=True,id=e43829dc-7578-4c37-87d3-5cbc96a2767f,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape43829dc-75')
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.598 186666 WARNING neutronclient.v2_0.client [req-ef1bf0de-5e75-4bf9-9216-71dc2bf4fac3 req-c0976e35-6f83-4320-8f31-f8765f45f11b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.760 186666 DEBUG nova.network.neutron [req-ef1bf0de-5e75-4bf9-9216-71dc2bf4fac3 req-c0976e35-6f83-4320-8f31-f8765f45f11b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Updated VIF entry in instance network info cache for port e43829dc-7578-4c37-87d3-5cbc96a2767f. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Feb 19 19:26:20 compute-0 nova_compute[186662]: 2026-02-19 19:26:20.761 186666 DEBUG nova.network.neutron [req-ef1bf0de-5e75-4bf9-9216-71dc2bf4fac3 req-c0976e35-6f83-4320-8f31-f8765f45f11b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Updating instance_info_cache with network_info: [{"id": "e43829dc-7578-4c37-87d3-5cbc96a2767f", "address": "fa:16:3e:0c:fd:c8", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape43829dc-75", "ovs_interfaceid": "e43829dc-7578-4c37-87d3-5cbc96a2767f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:26:21 compute-0 nova_compute[186662]: 2026-02-19 19:26:21.270 186666 DEBUG oslo_concurrency.lockutils [req-ef1bf0de-5e75-4bf9-9216-71dc2bf4fac3 req-c0976e35-6f83-4320-8f31-f8765f45f11b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-df77e346-76b1-4b06-8611-44d3ac9fc3ef" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:26:21 compute-0 nova_compute[186662]: 2026-02-19 19:26:21.796 186666 DEBUG nova.virt.libvirt.driver [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:26:21 compute-0 nova_compute[186662]: 2026-02-19 19:26:21.796 186666 DEBUG nova.virt.libvirt.driver [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:26:21 compute-0 nova_compute[186662]: 2026-02-19 19:26:21.796 186666 DEBUG nova.virt.libvirt.driver [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] No VIF found with MAC fa:16:3e:0c:fd:c8, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Feb 19 19:26:21 compute-0 nova_compute[186662]: 2026-02-19 19:26:21.797 186666 INFO nova.virt.libvirt.driver [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Using config drive
Feb 19 19:26:21 compute-0 kernel: tape43829dc-75: entered promiscuous mode
Feb 19 19:26:21 compute-0 NetworkManager[56519]: <info>  [1771529181.8366] manager: (tape43829dc-75): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Feb 19 19:26:21 compute-0 ovn_controller[96653]: 2026-02-19T19:26:21Z|00054|binding|INFO|Claiming lport e43829dc-7578-4c37-87d3-5cbc96a2767f for this chassis.
Feb 19 19:26:21 compute-0 ovn_controller[96653]: 2026-02-19T19:26:21Z|00055|binding|INFO|e43829dc-7578-4c37-87d3-5cbc96a2767f: Claiming fa:16:3e:0c:fd:c8 10.100.0.11
Feb 19 19:26:21 compute-0 nova_compute[186662]: 2026-02-19 19:26:21.838 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:21 compute-0 ovn_controller[96653]: 2026-02-19T19:26:21Z|00056|binding|INFO|Setting lport e43829dc-7578-4c37-87d3-5cbc96a2767f ovn-installed in OVS
Feb 19 19:26:21 compute-0 ovn_controller[96653]: 2026-02-19T19:26:21Z|00057|binding|INFO|Setting lport e43829dc-7578-4c37-87d3-5cbc96a2767f up in Southbound
Feb 19 19:26:21 compute-0 nova_compute[186662]: 2026-02-19 19:26:21.845 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:21.846 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0c:fd:c8 10.100.0.11'], port_security=['fa:16:3e:0c:fd:c8 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'df77e346-76b1-4b06-8611-44d3ac9fc3ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-23744514-9581-483b-ba8d-38106bcd89ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84d22a8926d9401eb98cf092c0899a62', 'neutron:revision_number': '9', 'neutron:security_group_ids': '286c3adc-3464-487a-ae0e-8231188cef8f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46bbe139-eec8-4de4-bfdc-b507967fb452, chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=e43829dc-7578-4c37-87d3-5cbc96a2767f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:26:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:21.847 105986 INFO neutron.agent.ovn.metadata.agent [-] Port e43829dc-7578-4c37-87d3-5cbc96a2767f in datapath 23744514-9581-483b-ba8d-38106bcd89ef bound to our chassis
Feb 19 19:26:21 compute-0 nova_compute[186662]: 2026-02-19 19:26:21.849 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:21.850 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 23744514-9581-483b-ba8d-38106bcd89ef
Feb 19 19:26:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:21.864 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[39f183e2-8fb3-45e9-a2f0-997bcf93df82]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:26:21 compute-0 systemd-udevd[209388]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:26:21 compute-0 systemd-machined[156014]: New machine qemu-3-instance-00000004.
Feb 19 19:26:21 compute-0 NetworkManager[56519]: <info>  [1771529181.8862] device (tape43829dc-75): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:26:21 compute-0 NetworkManager[56519]: <info>  [1771529181.8870] device (tape43829dc-75): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:26:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:21.890 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[fe0f205b-4189-485e-869b-ce243c1d978f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:26:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:21.892 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[69a79e9e-e154-4b52-8292-961448fc11ae]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:26:21 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000004.
Feb 19 19:26:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:21.912 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[779ccc14-bfbf-4818-a84a-533114671977]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:26:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:21.926 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a630dd77-a09e-44d9-b6a2-2c81bb8b4095]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap23744514-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:f0:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341570, 'reachable_time': 39245, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209395, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:26:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:21.942 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[9035728b-d4e5-49ab-97ed-8477cf56afb3]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap23744514-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341576, 'tstamp': 341576}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209401, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap23744514-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341578, 'tstamp': 341578}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209401, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:26:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:21.943 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23744514-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:26:21 compute-0 nova_compute[186662]: 2026-02-19 19:26:21.966 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:21 compute-0 nova_compute[186662]: 2026-02-19 19:26:21.967 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:21.967 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23744514-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:26:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:21.968 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:26:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:21.968 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap23744514-90, col_values=(('external_ids', {'iface-id': '25b16785-ed71-4ba6-a91b-c4dcee5ff875'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:26:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:21.968 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:26:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:21.969 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[c7a89eb4-2d46-4442-b2df-752b3259c64c]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-23744514-9581-483b-ba8d-38106bcd89ef\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 23744514-9581-483b-ba8d-38106bcd89ef\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:26:22 compute-0 nova_compute[186662]: 2026-02-19 19:26:22.113 186666 DEBUG nova.compute.manager [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Feb 19 19:26:22 compute-0 nova_compute[186662]: 2026-02-19 19:26:22.117 186666 INFO nova.virt.libvirt.driver [-] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Instance running successfully.
Feb 19 19:26:22 compute-0 virtqemud[186157]: argument unsupported: QEMU guest agent is not configured
Feb 19 19:26:22 compute-0 nova_compute[186662]: 2026-02-19 19:26:22.119 186666 DEBUG nova.virt.libvirt.guest [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:200
Feb 19 19:26:22 compute-0 nova_compute[186662]: 2026-02-19 19:26:22.119 186666 DEBUG nova.virt.libvirt.driver [None req-bc88a54f-e710-4de4-baa0-41ce863ab1de 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] finish_migration finished successfully. finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12699
Feb 19 19:26:22 compute-0 nova_compute[186662]: 2026-02-19 19:26:22.171 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:22 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 19 19:26:22 compute-0 systemd[209302]: Activating special unit Exit the Session...
Feb 19 19:26:22 compute-0 systemd[209302]: Stopped target Main User Target.
Feb 19 19:26:22 compute-0 systemd[209302]: Stopped target Basic System.
Feb 19 19:26:22 compute-0 systemd[209302]: Stopped target Paths.
Feb 19 19:26:22 compute-0 systemd[209302]: Stopped target Sockets.
Feb 19 19:26:22 compute-0 systemd[209302]: Stopped target Timers.
Feb 19 19:26:22 compute-0 systemd[209302]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 19 19:26:22 compute-0 systemd[209302]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 19 19:26:22 compute-0 systemd[209302]: Closed D-Bus User Message Bus Socket.
Feb 19 19:26:22 compute-0 systemd[209302]: Stopped Create User's Volatile Files and Directories.
Feb 19 19:26:22 compute-0 systemd[209302]: Removed slice User Application Slice.
Feb 19 19:26:22 compute-0 systemd[209302]: Reached target Shutdown.
Feb 19 19:26:22 compute-0 systemd[209302]: Finished Exit the Session.
Feb 19 19:26:22 compute-0 systemd[209302]: Reached target Exit the Session.
Feb 19 19:26:22 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 19 19:26:22 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 19 19:26:22 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 19 19:26:22 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 19 19:26:22 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 19 19:26:22 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 19 19:26:22 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 19 19:26:22 compute-0 nova_compute[186662]: 2026-02-19 19:26:22.660 186666 DEBUG nova.compute.manager [req-1ca4c26f-dd67-4fd1-918f-bc45ba30caa2 req-1c41f2bf-e7af-4d01-89c7-da92fa8c6758 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Received event network-vif-plugged-e43829dc-7578-4c37-87d3-5cbc96a2767f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:26:22 compute-0 nova_compute[186662]: 2026-02-19 19:26:22.661 186666 DEBUG oslo_concurrency.lockutils [req-1ca4c26f-dd67-4fd1-918f-bc45ba30caa2 req-1c41f2bf-e7af-4d01-89c7-da92fa8c6758 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:26:22 compute-0 nova_compute[186662]: 2026-02-19 19:26:22.661 186666 DEBUG oslo_concurrency.lockutils [req-1ca4c26f-dd67-4fd1-918f-bc45ba30caa2 req-1c41f2bf-e7af-4d01-89c7-da92fa8c6758 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:26:22 compute-0 nova_compute[186662]: 2026-02-19 19:26:22.661 186666 DEBUG oslo_concurrency.lockutils [req-1ca4c26f-dd67-4fd1-918f-bc45ba30caa2 req-1c41f2bf-e7af-4d01-89c7-da92fa8c6758 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:26:22 compute-0 nova_compute[186662]: 2026-02-19 19:26:22.661 186666 DEBUG nova.compute.manager [req-1ca4c26f-dd67-4fd1-918f-bc45ba30caa2 req-1c41f2bf-e7af-4d01-89c7-da92fa8c6758 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] No waiting events found dispatching network-vif-plugged-e43829dc-7578-4c37-87d3-5cbc96a2767f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:26:22 compute-0 nova_compute[186662]: 2026-02-19 19:26:22.662 186666 WARNING nova.compute.manager [req-1ca4c26f-dd67-4fd1-918f-bc45ba30caa2 req-1c41f2bf-e7af-4d01-89c7-da92fa8c6758 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Received unexpected event network-vif-plugged-e43829dc-7578-4c37-87d3-5cbc96a2767f for instance with vm_state active and task_state resize_finish.
Feb 19 19:26:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:22.697 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:26:22 compute-0 nova_compute[186662]: 2026-02-19 19:26:22.698 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:22.699 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:26:24 compute-0 nova_compute[186662]: 2026-02-19 19:26:24.752 186666 DEBUG nova.compute.manager [req-b1e0c3df-b99c-46c6-a209-60a1cc521de5 req-6d260d8f-0f0c-40df-8cdb-f66cdf310020 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Received event network-vif-plugged-e43829dc-7578-4c37-87d3-5cbc96a2767f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:26:24 compute-0 nova_compute[186662]: 2026-02-19 19:26:24.755 186666 DEBUG oslo_concurrency.lockutils [req-b1e0c3df-b99c-46c6-a209-60a1cc521de5 req-6d260d8f-0f0c-40df-8cdb-f66cdf310020 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:26:24 compute-0 nova_compute[186662]: 2026-02-19 19:26:24.756 186666 DEBUG oslo_concurrency.lockutils [req-b1e0c3df-b99c-46c6-a209-60a1cc521de5 req-6d260d8f-0f0c-40df-8cdb-f66cdf310020 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:26:24 compute-0 nova_compute[186662]: 2026-02-19 19:26:24.756 186666 DEBUG oslo_concurrency.lockutils [req-b1e0c3df-b99c-46c6-a209-60a1cc521de5 req-6d260d8f-0f0c-40df-8cdb-f66cdf310020 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:26:24 compute-0 nova_compute[186662]: 2026-02-19 19:26:24.757 186666 DEBUG nova.compute.manager [req-b1e0c3df-b99c-46c6-a209-60a1cc521de5 req-6d260d8f-0f0c-40df-8cdb-f66cdf310020 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] No waiting events found dispatching network-vif-plugged-e43829dc-7578-4c37-87d3-5cbc96a2767f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:26:24 compute-0 nova_compute[186662]: 2026-02-19 19:26:24.757 186666 WARNING nova.compute.manager [req-b1e0c3df-b99c-46c6-a209-60a1cc521de5 req-6d260d8f-0f0c-40df-8cdb-f66cdf310020 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Received unexpected event network-vif-plugged-e43829dc-7578-4c37-87d3-5cbc96a2767f for instance with vm_state resized and task_state None.
Feb 19 19:26:25 compute-0 nova_compute[186662]: 2026-02-19 19:26:25.242 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:25 compute-0 podman[209414]: 2026-02-19 19:26:25.28511468 +0000 UTC m=+0.059468581 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 19 19:26:27 compute-0 nova_compute[186662]: 2026-02-19 19:26:27.173 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:27 compute-0 podman[209434]: 2026-02-19 19:26:27.281480865 +0000 UTC m=+0.056854928 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, release=1770267347, architecture=x86_64, config_id=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 19 19:26:29 compute-0 podman[209455]: 2026-02-19 19:26:29.331335246 +0000 UTC m=+0.103506100 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260216)
Feb 19 19:26:29 compute-0 podman[196025]: time="2026-02-19T19:26:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:26:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:26:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17220 "" "Go-http-client/1.1"
Feb 19 19:26:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:26:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2625 "" "Go-http-client/1.1"
Feb 19 19:26:30 compute-0 nova_compute[186662]: 2026-02-19 19:26:30.245 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:30 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:30.700 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:26:31 compute-0 openstack_network_exporter[198916]: ERROR   19:26:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:26:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:26:31 compute-0 openstack_network_exporter[198916]: ERROR   19:26:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:26:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:26:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:32.119 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:26:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:32.119 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:26:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:26:32.119 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:26:32 compute-0 nova_compute[186662]: 2026-02-19 19:26:32.175 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:33 compute-0 ovn_controller[96653]: 2026-02-19T19:26:33Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0c:fd:c8 10.100.0.11
Feb 19 19:26:35 compute-0 nova_compute[186662]: 2026-02-19 19:26:35.247 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:37 compute-0 sshd-session[209488]: Invalid user nutanix from 189.165.79.177 port 42272
Feb 19 19:26:37 compute-0 sshd-session[209488]: Received disconnect from 189.165.79.177 port 42272:11: Bye Bye [preauth]
Feb 19 19:26:37 compute-0 sshd-session[209488]: Disconnected from invalid user nutanix 189.165.79.177 port 42272 [preauth]
Feb 19 19:26:37 compute-0 nova_compute[186662]: 2026-02-19 19:26:37.225 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:38 compute-0 podman[209490]: 2026-02-19 19:26:38.30679165 +0000 UTC m=+0.074436781 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 19 19:26:40 compute-0 nova_compute[186662]: 2026-02-19 19:26:40.251 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:42 compute-0 nova_compute[186662]: 2026-02-19 19:26:42.226 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:45 compute-0 nova_compute[186662]: 2026-02-19 19:26:45.252 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:47 compute-0 nova_compute[186662]: 2026-02-19 19:26:47.230 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:49 compute-0 sshd-session[209514]: Invalid user systemd from 197.211.55.20 port 39936
Feb 19 19:26:49 compute-0 sshd-session[209514]: Received disconnect from 197.211.55.20 port 39936:11: Bye Bye [preauth]
Feb 19 19:26:49 compute-0 sshd-session[209514]: Disconnected from invalid user systemd 197.211.55.20 port 39936 [preauth]
Feb 19 19:26:50 compute-0 nova_compute[186662]: 2026-02-19 19:26:50.255 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:52 compute-0 nova_compute[186662]: 2026-02-19 19:26:52.232 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:55 compute-0 nova_compute[186662]: 2026-02-19 19:26:55.258 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:55 compute-0 sshd-session[209516]: Invalid user ftp-test from 106.51.64.128 port 49121
Feb 19 19:26:55 compute-0 podman[209518]: 2026-02-19 19:26:55.946836983 +0000 UTC m=+0.053972388 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_build_tag=watcher_latest)
Feb 19 19:26:56 compute-0 sshd-session[209516]: Received disconnect from 106.51.64.128 port 49121:11: Bye Bye [preauth]
Feb 19 19:26:56 compute-0 sshd-session[209516]: Disconnected from invalid user ftp-test 106.51.64.128 port 49121 [preauth]
Feb 19 19:26:57 compute-0 nova_compute[186662]: 2026-02-19 19:26:57.234 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:26:57 compute-0 nova_compute[186662]: 2026-02-19 19:26:57.476 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "9176d9ab-648d-4303-8d54-b208a7e1d395" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:26:57 compute-0 nova_compute[186662]: 2026-02-19 19:26:57.477 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "9176d9ab-648d-4303-8d54-b208a7e1d395" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:26:57 compute-0 nova_compute[186662]: 2026-02-19 19:26:57.983 186666 DEBUG nova.compute.manager [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Feb 19 19:26:58 compute-0 podman[209537]: 2026-02-19 19:26:58.284243902 +0000 UTC m=+0.053652271 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, release=1770267347, distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9/ubi-minimal)
Feb 19 19:26:58 compute-0 nova_compute[186662]: 2026-02-19 19:26:58.518 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:26:58 compute-0 nova_compute[186662]: 2026-02-19 19:26:58.519 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:26:58 compute-0 nova_compute[186662]: 2026-02-19 19:26:58.528 186666 DEBUG nova.virt.hardware [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Feb 19 19:26:58 compute-0 nova_compute[186662]: 2026-02-19 19:26:58.528 186666 INFO nova.compute.claims [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Claim successful on node compute-0.ctlplane.example.com
Feb 19 19:26:59 compute-0 nova_compute[186662]: 2026-02-19 19:26:59.609 186666 DEBUG nova.compute.provider_tree [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:26:59 compute-0 podman[196025]: time="2026-02-19T19:26:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:26:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:26:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17220 "" "Go-http-client/1.1"
Feb 19 19:26:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:26:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2627 "" "Go-http-client/1.1"
Feb 19 19:27:00 compute-0 nova_compute[186662]: 2026-02-19 19:27:00.116 186666 DEBUG nova.scheduler.client.report [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:27:00 compute-0 nova_compute[186662]: 2026-02-19 19:27:00.261 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:00 compute-0 podman[209572]: 2026-02-19 19:27:00.31151497 +0000 UTC m=+0.083971061 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Feb 19 19:27:00 compute-0 nova_compute[186662]: 2026-02-19 19:27:00.630 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.111s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:27:00 compute-0 nova_compute[186662]: 2026-02-19 19:27:00.631 186666 DEBUG nova.compute.manager [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Feb 19 19:27:01 compute-0 nova_compute[186662]: 2026-02-19 19:27:01.154 186666 DEBUG nova.compute.manager [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Feb 19 19:27:01 compute-0 nova_compute[186662]: 2026-02-19 19:27:01.154 186666 DEBUG nova.network.neutron [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Feb 19 19:27:01 compute-0 nova_compute[186662]: 2026-02-19 19:27:01.155 186666 WARNING neutronclient.v2_0.client [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:27:01 compute-0 nova_compute[186662]: 2026-02-19 19:27:01.155 186666 WARNING neutronclient.v2_0.client [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:27:01 compute-0 openstack_network_exporter[198916]: ERROR   19:27:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:27:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:27:01 compute-0 openstack_network_exporter[198916]: ERROR   19:27:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:27:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:27:01 compute-0 nova_compute[186662]: 2026-02-19 19:27:01.664 186666 INFO nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 19 19:27:01 compute-0 nova_compute[186662]: 2026-02-19 19:27:01.939 186666 DEBUG nova.network.neutron [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Successfully created port: 61c38173-e58d-41f4-a6e7-aa24531b3ed2 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Feb 19 19:27:02 compute-0 nova_compute[186662]: 2026-02-19 19:27:02.180 186666 DEBUG nova.compute.manager [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Feb 19 19:27:02 compute-0 nova_compute[186662]: 2026-02-19 19:27:02.236 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:02 compute-0 nova_compute[186662]: 2026-02-19 19:27:02.825 186666 DEBUG nova.network.neutron [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Successfully updated port: 61c38173-e58d-41f4-a6e7-aa24531b3ed2 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Feb 19 19:27:02 compute-0 nova_compute[186662]: 2026-02-19 19:27:02.900 186666 DEBUG nova.compute.manager [req-49b36edb-7609-4c23-a8ca-70cd3069a946 req-8856a5c0-8b26-4fcd-bbe1-b55fe13c0806 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Received event network-changed-61c38173-e58d-41f4-a6e7-aa24531b3ed2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:27:02 compute-0 nova_compute[186662]: 2026-02-19 19:27:02.900 186666 DEBUG nova.compute.manager [req-49b36edb-7609-4c23-a8ca-70cd3069a946 req-8856a5c0-8b26-4fcd-bbe1-b55fe13c0806 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Refreshing instance network info cache due to event network-changed-61c38173-e58d-41f4-a6e7-aa24531b3ed2. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Feb 19 19:27:02 compute-0 nova_compute[186662]: 2026-02-19 19:27:02.901 186666 DEBUG oslo_concurrency.lockutils [req-49b36edb-7609-4c23-a8ca-70cd3069a946 req-8856a5c0-8b26-4fcd-bbe1-b55fe13c0806 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-9176d9ab-648d-4303-8d54-b208a7e1d395" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:27:02 compute-0 nova_compute[186662]: 2026-02-19 19:27:02.901 186666 DEBUG oslo_concurrency.lockutils [req-49b36edb-7609-4c23-a8ca-70cd3069a946 req-8856a5c0-8b26-4fcd-bbe1-b55fe13c0806 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-9176d9ab-648d-4303-8d54-b208a7e1d395" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:27:02 compute-0 nova_compute[186662]: 2026-02-19 19:27:02.902 186666 DEBUG nova.network.neutron [req-49b36edb-7609-4c23-a8ca-70cd3069a946 req-8856a5c0-8b26-4fcd-bbe1-b55fe13c0806 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Refreshing network info cache for port 61c38173-e58d-41f4-a6e7-aa24531b3ed2 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.203 186666 DEBUG nova.compute.manager [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.205 186666 DEBUG nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.206 186666 INFO nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Creating image(s)
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.208 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "/var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.208 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "/var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.209 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "/var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.210 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.217 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.219 186666 DEBUG oslo_concurrency.processutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.274 186666 DEBUG oslo_concurrency.processutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.275 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.277 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.279 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.285 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.286 186666 DEBUG oslo_concurrency.processutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.333 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "refresh_cache-9176d9ab-648d-4303-8d54-b208a7e1d395" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.338 186666 DEBUG oslo_concurrency.processutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.339 186666 DEBUG oslo_concurrency.processutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.372 186666 DEBUG oslo_concurrency.processutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.373 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.097s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.374 186666 DEBUG oslo_concurrency.processutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.407 186666 WARNING neutronclient.v2_0.client [req-49b36edb-7609-4c23-a8ca-70cd3069a946 req-8856a5c0-8b26-4fcd-bbe1-b55fe13c0806 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.427 186666 DEBUG oslo_concurrency.processutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.428 186666 DEBUG nova.virt.disk.api [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Checking if we can resize image /var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.429 186666 DEBUG oslo_concurrency.processutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.504 186666 DEBUG oslo_concurrency.processutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.506 186666 DEBUG nova.virt.disk.api [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Cannot resize image /var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.507 186666 DEBUG nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.507 186666 DEBUG nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Ensure instance console log exists: /var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.508 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.509 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.509 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.542 186666 DEBUG nova.network.neutron [req-49b36edb-7609-4c23-a8ca-70cd3069a946 req-8856a5c0-8b26-4fcd-bbe1-b55fe13c0806 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:27:03 compute-0 nova_compute[186662]: 2026-02-19 19:27:03.754 186666 DEBUG nova.network.neutron [req-49b36edb-7609-4c23-a8ca-70cd3069a946 req-8856a5c0-8b26-4fcd-bbe1-b55fe13c0806 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:27:04 compute-0 nova_compute[186662]: 2026-02-19 19:27:04.260 186666 DEBUG oslo_concurrency.lockutils [req-49b36edb-7609-4c23-a8ca-70cd3069a946 req-8856a5c0-8b26-4fcd-bbe1-b55fe13c0806 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-9176d9ab-648d-4303-8d54-b208a7e1d395" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:27:04 compute-0 nova_compute[186662]: 2026-02-19 19:27:04.261 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquired lock "refresh_cache-9176d9ab-648d-4303-8d54-b208a7e1d395" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:27:04 compute-0 nova_compute[186662]: 2026-02-19 19:27:04.261 186666 DEBUG nova.network.neutron [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:27:05 compute-0 nova_compute[186662]: 2026-02-19 19:27:05.165 186666 DEBUG nova.network.neutron [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:27:05 compute-0 nova_compute[186662]: 2026-02-19 19:27:05.263 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:05 compute-0 nova_compute[186662]: 2026-02-19 19:27:05.396 186666 WARNING neutronclient.v2_0.client [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:27:05 compute-0 nova_compute[186662]: 2026-02-19 19:27:05.756 186666 DEBUG nova.network.neutron [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Updating instance_info_cache with network_info: [{"id": "61c38173-e58d-41f4-a6e7-aa24531b3ed2", "address": "fa:16:3e:12:3f:ba", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c38173-e5", "ovs_interfaceid": "61c38173-e58d-41f4-a6e7-aa24531b3ed2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.263 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Releasing lock "refresh_cache-9176d9ab-648d-4303-8d54-b208a7e1d395" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.263 186666 DEBUG nova.compute.manager [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Instance network_info: |[{"id": "61c38173-e58d-41f4-a6e7-aa24531b3ed2", "address": "fa:16:3e:12:3f:ba", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c38173-e5", "ovs_interfaceid": "61c38173-e58d-41f4-a6e7-aa24531b3ed2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.266 186666 DEBUG nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Start _get_guest_xml network_info=[{"id": "61c38173-e58d-41f4-a6e7-aa24531b3ed2", "address": "fa:16:3e:12:3f:ba", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c38173-e5", "ovs_interfaceid": "61c38173-e58d-41f4-a6e7-aa24531b3ed2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'image_id': 'b8007ea6-afa7-4c5a-abc0-d9d7338ce087'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.270 186666 WARNING nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.272 186666 DEBUG nova.virt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1665463927', uuid='9176d9ab-648d-4303-8d54-b208a7e1d395'), owner=OwnerMeta(userid='af924faf672a45b4b8708466af6eeb12', username='tempest-TestExecuteActionsViaActuator-1567565925-project-admin', projectid='84d22a8926d9401eb98cf092c0899a62', projectname='tempest-TestExecuteActionsViaActuator-1567565925'), image=ImageMeta(id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "61c38173-e58d-41f4-a6e7-aa24531b3ed2", "address": "fa:16:3e:12:3f:ba", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c38173-e5", "ovs_interfaceid": 
"61c38173-e58d-41f4-a6e7-aa24531b3ed2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1771529226.2719524) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.278 186666 DEBUG nova.virt.libvirt.host [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.278 186666 DEBUG nova.virt.libvirt.host [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.282 186666 DEBUG nova.virt.libvirt.host [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.283 186666 DEBUG nova.virt.libvirt.host [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.284 186666 DEBUG nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.284 186666 DEBUG nova.virt.hardware [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-19T19:19:33Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.284 186666 DEBUG nova.virt.hardware [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.284 186666 DEBUG nova.virt.hardware [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.285 186666 DEBUG nova.virt.hardware [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.285 186666 DEBUG nova.virt.hardware [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.285 186666 DEBUG nova.virt.hardware [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.285 186666 DEBUG nova.virt.hardware [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.285 186666 DEBUG nova.virt.hardware [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.286 186666 DEBUG nova.virt.hardware [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.286 186666 DEBUG nova.virt.hardware [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.286 186666 DEBUG nova.virt.hardware [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.289 186666 DEBUG nova.virt.libvirt.vif [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:26:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1665463927',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1665463927',id=7,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84d22a8926d9401eb98cf092c0899a62',ramdisk_id='',reservation_id='r-6oueca0b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1567565925',owner_user_name='tempest-TestExecuteAction
sViaActuator-1567565925-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:27:02Z,user_data=None,user_id='af924faf672a45b4b8708466af6eeb12',uuid=9176d9ab-648d-4303-8d54-b208a7e1d395,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "61c38173-e58d-41f4-a6e7-aa24531b3ed2", "address": "fa:16:3e:12:3f:ba", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c38173-e5", "ovs_interfaceid": "61c38173-e58d-41f4-a6e7-aa24531b3ed2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.290 186666 DEBUG nova.network.os_vif_util [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converting VIF {"id": "61c38173-e58d-41f4-a6e7-aa24531b3ed2", "address": "fa:16:3e:12:3f:ba", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c38173-e5", "ovs_interfaceid": "61c38173-e58d-41f4-a6e7-aa24531b3ed2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.290 186666 DEBUG nova.network.os_vif_util [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:12:3f:ba,bridge_name='br-int',has_traffic_filtering=True,id=61c38173-e58d-41f4-a6e7-aa24531b3ed2,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c38173-e5') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.291 186666 DEBUG nova.objects.instance [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9176d9ab-648d-4303-8d54-b208a7e1d395 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.797 186666 DEBUG nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] End _get_guest_xml xml=<domain type="kvm">
Feb 19 19:27:06 compute-0 nova_compute[186662]:   <uuid>9176d9ab-648d-4303-8d54-b208a7e1d395</uuid>
Feb 19 19:27:06 compute-0 nova_compute[186662]:   <name>instance-00000007</name>
Feb 19 19:27:06 compute-0 nova_compute[186662]:   <memory>131072</memory>
Feb 19 19:27:06 compute-0 nova_compute[186662]:   <vcpu>1</vcpu>
Feb 19 19:27:06 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1665463927</nova:name>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:27:06</nova:creationTime>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:27:06 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:27:06 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:27:06 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:27:06 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:27:06 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:27:06 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:27:06 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:27:06 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:27:06 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:27:06 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:27:06 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:27:06 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:27:06 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:27:06 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:27:06 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:27:06 compute-0 nova_compute[186662]:         <nova:user uuid="af924faf672a45b4b8708466af6eeb12">tempest-TestExecuteActionsViaActuator-1567565925-project-admin</nova:user>
Feb 19 19:27:06 compute-0 nova_compute[186662]:         <nova:project uuid="84d22a8926d9401eb98cf092c0899a62">tempest-TestExecuteActionsViaActuator-1567565925</nova:project>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:27:06 compute-0 nova_compute[186662]:         <nova:port uuid="61c38173-e58d-41f4-a6e7-aa24531b3ed2">
Feb 19 19:27:06 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:27:06 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:27:06 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <system>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <entry name="serial">9176d9ab-648d-4303-8d54-b208a7e1d395</entry>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <entry name="uuid">9176d9ab-648d-4303-8d54-b208a7e1d395</entry>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     </system>
Feb 19 19:27:06 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:27:06 compute-0 nova_compute[186662]:   <os>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:   </os>
Feb 19 19:27:06 compute-0 nova_compute[186662]:   <features>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <vmcoreinfo/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:   </features>
Feb 19 19:27:06 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:27:06 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact">
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <model>Nehalem</model>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <topology sockets="1" cores="1" threads="1"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:27:06 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk.config"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <interface type="ethernet">
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <mac address="fa:16:3e:12:3f:ba"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <mtu size="1442"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <target dev="tap61c38173-e5"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <serial type="pty">
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/console.log" append="off"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <video>
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     </video>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <controller type="usb" index="0"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:27:06 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:27:06 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:27:06 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:27:06 compute-0 nova_compute[186662]: </domain>
Feb 19 19:27:06 compute-0 nova_compute[186662]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.799 186666 DEBUG nova.compute.manager [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Preparing to wait for external event network-vif-plugged-61c38173-e58d-41f4-a6e7-aa24531b3ed2 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.799 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "9176d9ab-648d-4303-8d54-b208a7e1d395-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.800 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "9176d9ab-648d-4303-8d54-b208a7e1d395-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.800 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "9176d9ab-648d-4303-8d54-b208a7e1d395-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.801 186666 DEBUG nova.virt.libvirt.vif [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:26:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1665463927',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1665463927',id=7,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84d22a8926d9401eb98cf092c0899a62',ramdisk_id='',reservation_id='r-6oueca0b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1567565925',owner_user_name='tempest-TestExecuteActionsViaActuator-1567565925-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:27:02Z,user_data=None,user_id='af924faf672a45b4b8708466af6eeb12',uuid=9176d9ab-648d-4303-8d54-b208a7e1d395,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "61c38173-e58d-41f4-a6e7-aa24531b3ed2", "address": "fa:16:3e:12:3f:ba", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c38173-e5", "ovs_interfaceid": "61c38173-e58d-41f4-a6e7-aa24531b3ed2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.802 186666 DEBUG nova.network.os_vif_util [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converting VIF {"id": "61c38173-e58d-41f4-a6e7-aa24531b3ed2", "address": "fa:16:3e:12:3f:ba", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c38173-e5", "ovs_interfaceid": "61c38173-e58d-41f4-a6e7-aa24531b3ed2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.803 186666 DEBUG nova.network.os_vif_util [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:12:3f:ba,bridge_name='br-int',has_traffic_filtering=True,id=61c38173-e58d-41f4-a6e7-aa24531b3ed2,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c38173-e5') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.804 186666 DEBUG os_vif [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:3f:ba,bridge_name='br-int',has_traffic_filtering=True,id=61c38173-e58d-41f4-a6e7-aa24531b3ed2,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c38173-e5') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.805 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.805 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.806 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.808 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.808 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '8eed36ce-23d7-55db-9f02-d2a1dfc33d98', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.810 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.812 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.815 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.815 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap61c38173-e5, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.815 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap61c38173-e5, col_values=(('qos', UUID('9297d20e-e7fb-4f35-b382-acc7a6ea7538')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.816 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap61c38173-e5, col_values=(('external_ids', {'iface-id': '61c38173-e58d-41f4-a6e7-aa24531b3ed2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:12:3f:ba', 'vm-uuid': '9176d9ab-648d-4303-8d54-b208a7e1d395'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.816 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:06 compute-0 NetworkManager[56519]: <info>  [1771529226.8178] manager: (tap61c38173-e5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.819 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.822 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:06 compute-0 nova_compute[186662]: 2026-02-19 19:27:06.823 186666 INFO os_vif [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:3f:ba,bridge_name='br-int',has_traffic_filtering=True,id=61c38173-e58d-41f4-a6e7-aa24531b3ed2,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c38173-e5')
Feb 19 19:27:07 compute-0 nova_compute[186662]: 2026-02-19 19:27:07.241 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:08 compute-0 nova_compute[186662]: 2026-02-19 19:27:08.365 186666 DEBUG nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:27:08 compute-0 nova_compute[186662]: 2026-02-19 19:27:08.365 186666 DEBUG nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:27:08 compute-0 nova_compute[186662]: 2026-02-19 19:27:08.365 186666 DEBUG nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] No VIF found with MAC fa:16:3e:12:3f:ba, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Feb 19 19:27:08 compute-0 nova_compute[186662]: 2026-02-19 19:27:08.366 186666 INFO nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Using config drive
Feb 19 19:27:08 compute-0 nova_compute[186662]: 2026-02-19 19:27:08.654 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:27:08 compute-0 nova_compute[186662]: 2026-02-19 19:27:08.655 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:27:08 compute-0 nova_compute[186662]: 2026-02-19 19:27:08.655 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:27:08 compute-0 nova_compute[186662]: 2026-02-19 19:27:08.655 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:27:08 compute-0 nova_compute[186662]: 2026-02-19 19:27:08.656 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:27:08 compute-0 nova_compute[186662]: 2026-02-19 19:27:08.875 186666 WARNING neutronclient.v2_0.client [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:27:09 compute-0 podman[209616]: 2026-02-19 19:27:09.297165074 +0000 UTC m=+0.073010587 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:27:09 compute-0 nova_compute[186662]: 2026-02-19 19:27:09.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:27:09 compute-0 nova_compute[186662]: 2026-02-19 19:27:09.667 186666 INFO nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Creating config drive at /var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk.config
Feb 19 19:27:09 compute-0 nova_compute[186662]: 2026-02-19 19:27:09.673 186666 DEBUG oslo_concurrency.processutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpe8r5nrf9 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:27:09 compute-0 nova_compute[186662]: 2026-02-19 19:27:09.798 186666 DEBUG oslo_concurrency.processutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpe8r5nrf9" returned: 0 in 0.125s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:27:09 compute-0 kernel: tap61c38173-e5: entered promiscuous mode
Feb 19 19:27:09 compute-0 NetworkManager[56519]: <info>  [1771529229.8549] manager: (tap61c38173-e5): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Feb 19 19:27:09 compute-0 ovn_controller[96653]: 2026-02-19T19:27:09Z|00058|binding|INFO|Claiming lport 61c38173-e58d-41f4-a6e7-aa24531b3ed2 for this chassis.
Feb 19 19:27:09 compute-0 ovn_controller[96653]: 2026-02-19T19:27:09Z|00059|binding|INFO|61c38173-e58d-41f4-a6e7-aa24531b3ed2: Claiming fa:16:3e:12:3f:ba 10.100.0.4
Feb 19 19:27:09 compute-0 nova_compute[186662]: 2026-02-19 19:27:09.900 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:09.907 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:12:3f:ba 10.100.0.4'], port_security=['fa:16:3e:12:3f:ba 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '9176d9ab-648d-4303-8d54-b208a7e1d395', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-23744514-9581-483b-ba8d-38106bcd89ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84d22a8926d9401eb98cf092c0899a62', 'neutron:revision_number': '4', 'neutron:security_group_ids': '286c3adc-3464-487a-ae0e-8231188cef8f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46bbe139-eec8-4de4-bfdc-b507967fb452, chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=61c38173-e58d-41f4-a6e7-aa24531b3ed2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:27:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:09.908 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 61c38173-e58d-41f4-a6e7-aa24531b3ed2 in datapath 23744514-9581-483b-ba8d-38106bcd89ef bound to our chassis
Feb 19 19:27:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:09.911 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 23744514-9581-483b-ba8d-38106bcd89ef
Feb 19 19:27:09 compute-0 ovn_controller[96653]: 2026-02-19T19:27:09Z|00060|binding|INFO|Setting lport 61c38173-e58d-41f4-a6e7-aa24531b3ed2 up in Southbound
Feb 19 19:27:09 compute-0 ovn_controller[96653]: 2026-02-19T19:27:09Z|00061|binding|INFO|Setting lport 61c38173-e58d-41f4-a6e7-aa24531b3ed2 ovn-installed in OVS
Feb 19 19:27:09 compute-0 nova_compute[186662]: 2026-02-19 19:27:09.914 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:09 compute-0 nova_compute[186662]: 2026-02-19 19:27:09.917 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:09 compute-0 systemd-machined[156014]: New machine qemu-4-instance-00000007.
Feb 19 19:27:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:09.927 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[af76dd66-c92e-4a2f-a89e-7397ac47112d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:27:09 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000007.
Feb 19 19:27:09 compute-0 systemd-udevd[209659]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:27:09 compute-0 NetworkManager[56519]: <info>  [1771529229.9476] device (tap61c38173-e5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:27:09 compute-0 NetworkManager[56519]: <info>  [1771529229.9484] device (tap61c38173-e5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:27:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:09.958 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[446f70c0-d447-4bf3-bbd3-18616aa48f99]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:27:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:09.961 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[9499f5b6-e8b6-43b5-aa78-a5ac23cf48fe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:27:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:09.977 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[b0a9d71b-077c-47a6-915d-dc4be5498a82]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:27:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:09.988 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[645a3fdc-1839-42fa-a39e-41eb94ee6ab4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap23744514-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:f0:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341570, 'reachable_time': 39245, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209671, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:27:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:09.998 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[bdd2818f-f752-44f0-a219-b883048ed67d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap23744514-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341576, 'tstamp': 341576}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209672, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap23744514-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341578, 'tstamp': 341578}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209672, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:27:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:10.000 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23744514-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:27:10 compute-0 nova_compute[186662]: 2026-02-19 19:27:10.001 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:10 compute-0 nova_compute[186662]: 2026-02-19 19:27:10.002 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:10.003 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23744514-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:27:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:10.003 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:27:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:10.003 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap23744514-90, col_values=(('external_ids', {'iface-id': '25b16785-ed71-4ba6-a91b-c4dcee5ff875'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:27:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:10.004 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:27:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:10.005 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a3fbe388-fb0c-44c3-a844-2cc45e3df9ac]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-23744514-9581-483b-ba8d-38106bcd89ef\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 23744514-9581-483b-ba8d-38106bcd89ef\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:27:10 compute-0 nova_compute[186662]: 2026-02-19 19:27:10.246 186666 DEBUG nova.compute.manager [req-be61a1e8-d1a9-4152-b5ed-b2e50448fb30 req-7f2489b0-7572-4efd-895f-76d5a55c2509 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Received event network-vif-plugged-61c38173-e58d-41f4-a6e7-aa24531b3ed2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:27:10 compute-0 nova_compute[186662]: 2026-02-19 19:27:10.246 186666 DEBUG oslo_concurrency.lockutils [req-be61a1e8-d1a9-4152-b5ed-b2e50448fb30 req-7f2489b0-7572-4efd-895f-76d5a55c2509 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "9176d9ab-648d-4303-8d54-b208a7e1d395-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:27:10 compute-0 nova_compute[186662]: 2026-02-19 19:27:10.247 186666 DEBUG oslo_concurrency.lockutils [req-be61a1e8-d1a9-4152-b5ed-b2e50448fb30 req-7f2489b0-7572-4efd-895f-76d5a55c2509 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "9176d9ab-648d-4303-8d54-b208a7e1d395-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:27:10 compute-0 nova_compute[186662]: 2026-02-19 19:27:10.247 186666 DEBUG oslo_concurrency.lockutils [req-be61a1e8-d1a9-4152-b5ed-b2e50448fb30 req-7f2489b0-7572-4efd-895f-76d5a55c2509 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "9176d9ab-648d-4303-8d54-b208a7e1d395-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:27:10 compute-0 nova_compute[186662]: 2026-02-19 19:27:10.247 186666 DEBUG nova.compute.manager [req-be61a1e8-d1a9-4152-b5ed-b2e50448fb30 req-7f2489b0-7572-4efd-895f-76d5a55c2509 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Processing event network-vif-plugged-61c38173-e58d-41f4-a6e7-aa24531b3ed2 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Feb 19 19:27:10 compute-0 nova_compute[186662]: 2026-02-19 19:27:10.248 186666 DEBUG nova.compute.manager [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Feb 19 19:27:10 compute-0 nova_compute[186662]: 2026-02-19 19:27:10.253 186666 DEBUG nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Feb 19 19:27:10 compute-0 nova_compute[186662]: 2026-02-19 19:27:10.256 186666 INFO nova.virt.libvirt.driver [-] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Instance spawned successfully.
Feb 19 19:27:10 compute-0 nova_compute[186662]: 2026-02-19 19:27:10.257 186666 DEBUG nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Feb 19 19:27:10 compute-0 nova_compute[186662]: 2026-02-19 19:27:10.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:27:10 compute-0 nova_compute[186662]: 2026-02-19 19:27:10.770 186666 DEBUG nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:27:10 compute-0 nova_compute[186662]: 2026-02-19 19:27:10.770 186666 DEBUG nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:27:10 compute-0 nova_compute[186662]: 2026-02-19 19:27:10.770 186666 DEBUG nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:27:10 compute-0 nova_compute[186662]: 2026-02-19 19:27:10.771 186666 DEBUG nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:27:10 compute-0 nova_compute[186662]: 2026-02-19 19:27:10.771 186666 DEBUG nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:27:10 compute-0 nova_compute[186662]: 2026-02-19 19:27:10.771 186666 DEBUG nova.virt.libvirt.driver [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:27:11 compute-0 nova_compute[186662]: 2026-02-19 19:27:11.286 186666 INFO nova.compute.manager [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Took 8.08 seconds to spawn the instance on the hypervisor.
Feb 19 19:27:11 compute-0 nova_compute[186662]: 2026-02-19 19:27:11.286 186666 DEBUG nova.compute.manager [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:27:11 compute-0 nova_compute[186662]: 2026-02-19 19:27:11.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:27:11 compute-0 nova_compute[186662]: 2026-02-19 19:27:11.815 186666 INFO nova.compute.manager [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Took 13.33 seconds to build instance.
Feb 19 19:27:11 compute-0 nova_compute[186662]: 2026-02-19 19:27:11.818 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:12 compute-0 nova_compute[186662]: 2026-02-19 19:27:12.242 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:12 compute-0 nova_compute[186662]: 2026-02-19 19:27:12.288 186666 DEBUG nova.compute.manager [req-551e9204-84e8-454f-8775-ed7aed76746d req-04c08aab-d7ff-4e87-b18a-bafab5f2cae4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Received event network-vif-plugged-61c38173-e58d-41f4-a6e7-aa24531b3ed2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:27:12 compute-0 nova_compute[186662]: 2026-02-19 19:27:12.288 186666 DEBUG oslo_concurrency.lockutils [req-551e9204-84e8-454f-8775-ed7aed76746d req-04c08aab-d7ff-4e87-b18a-bafab5f2cae4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "9176d9ab-648d-4303-8d54-b208a7e1d395-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:27:12 compute-0 nova_compute[186662]: 2026-02-19 19:27:12.288 186666 DEBUG oslo_concurrency.lockutils [req-551e9204-84e8-454f-8775-ed7aed76746d req-04c08aab-d7ff-4e87-b18a-bafab5f2cae4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "9176d9ab-648d-4303-8d54-b208a7e1d395-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:27:12 compute-0 nova_compute[186662]: 2026-02-19 19:27:12.289 186666 DEBUG oslo_concurrency.lockutils [req-551e9204-84e8-454f-8775-ed7aed76746d req-04c08aab-d7ff-4e87-b18a-bafab5f2cae4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "9176d9ab-648d-4303-8d54-b208a7e1d395-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:27:12 compute-0 nova_compute[186662]: 2026-02-19 19:27:12.289 186666 DEBUG nova.compute.manager [req-551e9204-84e8-454f-8775-ed7aed76746d req-04c08aab-d7ff-4e87-b18a-bafab5f2cae4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] No waiting events found dispatching network-vif-plugged-61c38173-e58d-41f4-a6e7-aa24531b3ed2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:27:12 compute-0 nova_compute[186662]: 2026-02-19 19:27:12.289 186666 WARNING nova.compute.manager [req-551e9204-84e8-454f-8775-ed7aed76746d req-04c08aab-d7ff-4e87-b18a-bafab5f2cae4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Received unexpected event network-vif-plugged-61c38173-e58d-41f4-a6e7-aa24531b3ed2 for instance with vm_state active and task_state None.
Feb 19 19:27:12 compute-0 nova_compute[186662]: 2026-02-19 19:27:12.319 186666 DEBUG oslo_concurrency.lockutils [None req-17c39e2d-002e-4e9c-bcff-7884bc950f41 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "9176d9ab-648d-4303-8d54-b208a7e1d395" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.843s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:27:13 compute-0 nova_compute[186662]: 2026-02-19 19:27:13.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:27:14 compute-0 nova_compute[186662]: 2026-02-19 19:27:14.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:27:15 compute-0 nova_compute[186662]: 2026-02-19 19:27:15.086 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:27:15 compute-0 nova_compute[186662]: 2026-02-19 19:27:15.087 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:27:15 compute-0 nova_compute[186662]: 2026-02-19 19:27:15.087 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:27:15 compute-0 nova_compute[186662]: 2026-02-19 19:27:15.088 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:27:16 compute-0 nova_compute[186662]: 2026-02-19 19:27:16.138 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:27:16 compute-0 nova_compute[186662]: 2026-02-19 19:27:16.218 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:27:16 compute-0 nova_compute[186662]: 2026-02-19 19:27:16.219 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:27:16 compute-0 nova_compute[186662]: 2026-02-19 19:27:16.273 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:27:16 compute-0 nova_compute[186662]: 2026-02-19 19:27:16.280 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:27:16 compute-0 nova_compute[186662]: 2026-02-19 19:27:16.341 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:27:16 compute-0 nova_compute[186662]: 2026-02-19 19:27:16.342 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:27:16 compute-0 nova_compute[186662]: 2026-02-19 19:27:16.414 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:27:16 compute-0 nova_compute[186662]: 2026-02-19 19:27:16.418 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:27:16 compute-0 nova_compute[186662]: 2026-02-19 19:27:16.474 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:27:16 compute-0 nova_compute[186662]: 2026-02-19 19:27:16.474 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:27:16 compute-0 nova_compute[186662]: 2026-02-19 19:27:16.512 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk --force-share --output=json" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:27:16 compute-0 nova_compute[186662]: 2026-02-19 19:27:16.655 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:27:16 compute-0 nova_compute[186662]: 2026-02-19 19:27:16.656 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:27:16 compute-0 nova_compute[186662]: 2026-02-19 19:27:16.671 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:27:16 compute-0 nova_compute[186662]: 2026-02-19 19:27:16.671 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5379MB free_disk=72.91804504394531GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:27:16 compute-0 nova_compute[186662]: 2026-02-19 19:27:16.672 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:27:16 compute-0 nova_compute[186662]: 2026-02-19 19:27:16.672 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:27:16 compute-0 nova_compute[186662]: 2026-02-19 19:27:16.821 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:17 compute-0 nova_compute[186662]: 2026-02-19 19:27:17.244 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:17 compute-0 nova_compute[186662]: 2026-02-19 19:27:17.854 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance 72cc675e-4d5d-48c5-8c12-f9a42e168294 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:27:17 compute-0 nova_compute[186662]: 2026-02-19 19:27:17.855 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance df77e346-76b1-4b06-8611-44d3ac9fc3ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:27:17 compute-0 nova_compute[186662]: 2026-02-19 19:27:17.856 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance 9176d9ab-648d-4303-8d54-b208a7e1d395 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:27:17 compute-0 nova_compute[186662]: 2026-02-19 19:27:17.856 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:27:17 compute-0 nova_compute[186662]: 2026-02-19 19:27:17.857 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:27:16 up 58 min,  0 user,  load average: 0.52, 0.31, 0.37\n', 'num_instances': '3', 'num_vm_active': '3', 'num_task_None': '3', 'num_os_type_None': '3', 'num_proj_84d22a8926d9401eb98cf092c0899a62': '3', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:27:17 compute-0 nova_compute[186662]: 2026-02-19 19:27:17.935 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:27:18 compute-0 nova_compute[186662]: 2026-02-19 19:27:18.443 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:27:18 compute-0 nova_compute[186662]: 2026-02-19 19:27:18.954 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:27:18 compute-0 nova_compute[186662]: 2026-02-19 19:27:18.954 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.282s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:27:21 compute-0 sshd-session[209701]: Received disconnect from 45.148.10.147 port 13969:11:  [preauth]
Feb 19 19:27:21 compute-0 sshd-session[209701]: Disconnected from authenticating user root 45.148.10.147 port 13969 [preauth]
Feb 19 19:27:21 compute-0 nova_compute[186662]: 2026-02-19 19:27:21.825 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:21 compute-0 ovn_controller[96653]: 2026-02-19T19:27:21Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:12:3f:ba 10.100.0.4
Feb 19 19:27:21 compute-0 ovn_controller[96653]: 2026-02-19T19:27:21Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:12:3f:ba 10.100.0.4
Feb 19 19:27:22 compute-0 nova_compute[186662]: 2026-02-19 19:27:22.246 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:23 compute-0 nova_compute[186662]: 2026-02-19 19:27:23.763 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:23 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:23.764 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:27:23 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:23.765 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:27:26 compute-0 podman[209723]: 2026-02-19 19:27:26.291759822 +0000 UTC m=+0.062615477 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 19 19:27:26 compute-0 nova_compute[186662]: 2026-02-19 19:27:26.828 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:27 compute-0 nova_compute[186662]: 2026-02-19 19:27:27.249 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:28.766 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:27:29 compute-0 podman[209742]: 2026-02-19 19:27:29.302843781 +0000 UTC m=+0.078493678 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.display-name=Red 
Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, release=1770267347, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-type=git, version=9.7)
Feb 19 19:27:29 compute-0 podman[196025]: time="2026-02-19T19:27:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:27:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:27:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17220 "" "Go-http-client/1.1"
Feb 19 19:27:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:27:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2627 "" "Go-http-client/1.1"
Feb 19 19:27:31 compute-0 podman[209764]: 2026-02-19 19:27:31.350174051 +0000 UTC m=+0.117881146 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Feb 19 19:27:31 compute-0 openstack_network_exporter[198916]: ERROR   19:27:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:27:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:27:31 compute-0 openstack_network_exporter[198916]: ERROR   19:27:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:27:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:27:31 compute-0 nova_compute[186662]: 2026-02-19 19:27:31.830 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:32.120 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:27:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:32.121 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:27:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:32.121 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:27:32 compute-0 nova_compute[186662]: 2026-02-19 19:27:32.251 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:36 compute-0 nova_compute[186662]: 2026-02-19 19:27:36.833 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:37 compute-0 nova_compute[186662]: 2026-02-19 19:27:37.105 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "bff022af-564c-46f6-8756-17ea48c6fdc5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:27:37 compute-0 nova_compute[186662]: 2026-02-19 19:27:37.105 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "bff022af-564c-46f6-8756-17ea48c6fdc5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:27:37 compute-0 nova_compute[186662]: 2026-02-19 19:27:37.253 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:37 compute-0 nova_compute[186662]: 2026-02-19 19:27:37.611 186666 DEBUG nova.compute.manager [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Feb 19 19:27:38 compute-0 nova_compute[186662]: 2026-02-19 19:27:38.189 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:27:38 compute-0 nova_compute[186662]: 2026-02-19 19:27:38.191 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:27:38 compute-0 nova_compute[186662]: 2026-02-19 19:27:38.199 186666 DEBUG nova.virt.hardware [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Feb 19 19:27:38 compute-0 nova_compute[186662]: 2026-02-19 19:27:38.200 186666 INFO nova.compute.claims [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Claim successful on node compute-0.ctlplane.example.com
Feb 19 19:27:39 compute-0 nova_compute[186662]: 2026-02-19 19:27:39.303 186666 DEBUG nova.compute.provider_tree [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:27:39 compute-0 nova_compute[186662]: 2026-02-19 19:27:39.810 186666 DEBUG nova.scheduler.client.report [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:27:40 compute-0 podman[209805]: 2026-02-19 19:27:40.310383002 +0000 UTC m=+0.072640938 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:27:40 compute-0 nova_compute[186662]: 2026-02-19 19:27:40.321 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.130s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:27:40 compute-0 nova_compute[186662]: 2026-02-19 19:27:40.322 186666 DEBUG nova.compute.manager [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Feb 19 19:27:40 compute-0 nova_compute[186662]: 2026-02-19 19:27:40.835 186666 DEBUG nova.compute.manager [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Feb 19 19:27:40 compute-0 nova_compute[186662]: 2026-02-19 19:27:40.836 186666 DEBUG nova.network.neutron [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Feb 19 19:27:40 compute-0 nova_compute[186662]: 2026-02-19 19:27:40.836 186666 WARNING neutronclient.v2_0.client [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:27:40 compute-0 nova_compute[186662]: 2026-02-19 19:27:40.837 186666 WARNING neutronclient.v2_0.client [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:27:41 compute-0 nova_compute[186662]: 2026-02-19 19:27:41.345 186666 INFO nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 19 19:27:41 compute-0 nova_compute[186662]: 2026-02-19 19:27:41.723 186666 DEBUG nova.network.neutron [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Successfully created port: d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Feb 19 19:27:41 compute-0 nova_compute[186662]: 2026-02-19 19:27:41.836 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:41 compute-0 nova_compute[186662]: 2026-02-19 19:27:41.854 186666 DEBUG nova.compute.manager [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.230 186666 DEBUG nova.network.neutron [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Successfully updated port: d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.255 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.276 186666 DEBUG nova.compute.manager [req-ab2694f6-f5d7-40de-a553-5eb8e7c4cb98 req-40bcd769-f48e-49ee-8b73-dc625213ac21 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Received event network-changed-d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.277 186666 DEBUG nova.compute.manager [req-ab2694f6-f5d7-40de-a553-5eb8e7c4cb98 req-40bcd769-f48e-49ee-8b73-dc625213ac21 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Refreshing instance network info cache due to event network-changed-d89a6d19-fdc2-4b79-baf7-7f00bdc698b6. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.277 186666 DEBUG oslo_concurrency.lockutils [req-ab2694f6-f5d7-40de-a553-5eb8e7c4cb98 req-40bcd769-f48e-49ee-8b73-dc625213ac21 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-bff022af-564c-46f6-8756-17ea48c6fdc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.278 186666 DEBUG oslo_concurrency.lockutils [req-ab2694f6-f5d7-40de-a553-5eb8e7c4cb98 req-40bcd769-f48e-49ee-8b73-dc625213ac21 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-bff022af-564c-46f6-8756-17ea48c6fdc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.278 186666 DEBUG nova.network.neutron [req-ab2694f6-f5d7-40de-a553-5eb8e7c4cb98 req-40bcd769-f48e-49ee-8b73-dc625213ac21 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Refreshing network info cache for port d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.736 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "refresh_cache-bff022af-564c-46f6-8756-17ea48c6fdc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.785 186666 WARNING neutronclient.v2_0.client [req-ab2694f6-f5d7-40de-a553-5eb8e7c4cb98 req-40bcd769-f48e-49ee-8b73-dc625213ac21 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.874 186666 DEBUG nova.compute.manager [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.876 186666 DEBUG nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.877 186666 INFO nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Creating image(s)
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.877 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "/var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.878 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "/var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.879 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "/var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.880 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.886 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.887 186666 DEBUG oslo_concurrency.processutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.927 186666 DEBUG oslo_concurrency.processutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.929 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.930 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.931 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.937 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:27:42 compute-0 nova_compute[186662]: 2026-02-19 19:27:42.938 186666 DEBUG oslo_concurrency.processutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:27:43 compute-0 nova_compute[186662]: 2026-02-19 19:27:43.008 186666 DEBUG oslo_concurrency.processutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:27:43 compute-0 nova_compute[186662]: 2026-02-19 19:27:43.012 186666 DEBUG oslo_concurrency.processutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:27:43 compute-0 nova_compute[186662]: 2026-02-19 19:27:43.042 186666 DEBUG oslo_concurrency.processutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:27:43 compute-0 nova_compute[186662]: 2026-02-19 19:27:43.043 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:27:43 compute-0 nova_compute[186662]: 2026-02-19 19:27:43.043 186666 DEBUG oslo_concurrency.processutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:27:43 compute-0 nova_compute[186662]: 2026-02-19 19:27:43.093 186666 DEBUG oslo_concurrency.processutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:27:43 compute-0 nova_compute[186662]: 2026-02-19 19:27:43.094 186666 DEBUG nova.virt.disk.api [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Checking if we can resize image /var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:27:43 compute-0 nova_compute[186662]: 2026-02-19 19:27:43.095 186666 DEBUG oslo_concurrency.processutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:27:43 compute-0 nova_compute[186662]: 2026-02-19 19:27:43.141 186666 DEBUG oslo_concurrency.processutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:27:43 compute-0 nova_compute[186662]: 2026-02-19 19:27:43.142 186666 DEBUG nova.virt.disk.api [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Cannot resize image /var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:27:43 compute-0 nova_compute[186662]: 2026-02-19 19:27:43.143 186666 DEBUG nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Feb 19 19:27:43 compute-0 nova_compute[186662]: 2026-02-19 19:27:43.143 186666 DEBUG nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Ensure instance console log exists: /var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Feb 19 19:27:43 compute-0 nova_compute[186662]: 2026-02-19 19:27:43.144 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:27:43 compute-0 nova_compute[186662]: 2026-02-19 19:27:43.144 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:27:43 compute-0 nova_compute[186662]: 2026-02-19 19:27:43.145 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:27:43 compute-0 nova_compute[186662]: 2026-02-19 19:27:43.304 186666 DEBUG nova.network.neutron [req-ab2694f6-f5d7-40de-a553-5eb8e7c4cb98 req-40bcd769-f48e-49ee-8b73-dc625213ac21 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:27:43 compute-0 nova_compute[186662]: 2026-02-19 19:27:43.465 186666 DEBUG nova.network.neutron [req-ab2694f6-f5d7-40de-a553-5eb8e7c4cb98 req-40bcd769-f48e-49ee-8b73-dc625213ac21 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:27:43 compute-0 nova_compute[186662]: 2026-02-19 19:27:43.972 186666 DEBUG oslo_concurrency.lockutils [req-ab2694f6-f5d7-40de-a553-5eb8e7c4cb98 req-40bcd769-f48e-49ee-8b73-dc625213ac21 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-bff022af-564c-46f6-8756-17ea48c6fdc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:27:43 compute-0 nova_compute[186662]: 2026-02-19 19:27:43.972 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquired lock "refresh_cache-bff022af-564c-46f6-8756-17ea48c6fdc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:27:43 compute-0 nova_compute[186662]: 2026-02-19 19:27:43.973 186666 DEBUG nova.network.neutron [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:27:45 compute-0 nova_compute[186662]: 2026-02-19 19:27:45.182 186666 DEBUG nova.network.neutron [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:27:45 compute-0 nova_compute[186662]: 2026-02-19 19:27:45.417 186666 WARNING neutronclient.v2_0.client [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:27:46 compute-0 nova_compute[186662]: 2026-02-19 19:27:46.559 186666 DEBUG nova.network.neutron [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Updating instance_info_cache with network_info: [{"id": "d89a6d19-fdc2-4b79-baf7-7f00bdc698b6", "address": "fa:16:3e:9a:66:f9", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd89a6d19-fd", "ovs_interfaceid": "d89a6d19-fdc2-4b79-baf7-7f00bdc698b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:27:46 compute-0 nova_compute[186662]: 2026-02-19 19:27:46.840 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.068 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Releasing lock "refresh_cache-bff022af-564c-46f6-8756-17ea48c6fdc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.069 186666 DEBUG nova.compute.manager [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Instance network_info: |[{"id": "d89a6d19-fdc2-4b79-baf7-7f00bdc698b6", "address": "fa:16:3e:9a:66:f9", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd89a6d19-fd", "ovs_interfaceid": "d89a6d19-fdc2-4b79-baf7-7f00bdc698b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.071 186666 DEBUG nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Start _get_guest_xml network_info=[{"id": "d89a6d19-fdc2-4b79-baf7-7f00bdc698b6", "address": "fa:16:3e:9a:66:f9", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd89a6d19-fd", "ovs_interfaceid": "d89a6d19-fdc2-4b79-baf7-7f00bdc698b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'image_id': 'b8007ea6-afa7-4c5a-abc0-d9d7338ce087'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.075 186666 WARNING nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.076 186666 DEBUG nova.virt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-310914281', uuid='bff022af-564c-46f6-8756-17ea48c6fdc5'), owner=OwnerMeta(userid='af924faf672a45b4b8708466af6eeb12', username='tempest-TestExecuteActionsViaActuator-1567565925-project-admin', projectid='84d22a8926d9401eb98cf092c0899a62', projectname='tempest-TestExecuteActionsViaActuator-1567565925'), image=ImageMeta(id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "d89a6d19-fdc2-4b79-baf7-7f00bdc698b6", "address": "fa:16:3e:9a:66:f9", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd89a6d19-fd", "ovs_interfaceid": 
"d89a6d19-fdc2-4b79-baf7-7f00bdc698b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1771529267.0768898) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.081 186666 DEBUG nova.virt.libvirt.host [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.081 186666 DEBUG nova.virt.libvirt.host [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.084 186666 DEBUG nova.virt.libvirt.host [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.085 186666 DEBUG nova.virt.libvirt.host [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.086 186666 DEBUG nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.086 186666 DEBUG nova.virt.hardware [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-19T19:19:33Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.086 186666 DEBUG nova.virt.hardware [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.087 186666 DEBUG nova.virt.hardware [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.087 186666 DEBUG nova.virt.hardware [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.087 186666 DEBUG nova.virt.hardware [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.088 186666 DEBUG nova.virt.hardware [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.088 186666 DEBUG nova.virt.hardware [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.088 186666 DEBUG nova.virt.hardware [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.088 186666 DEBUG nova.virt.hardware [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.089 186666 DEBUG nova.virt.hardware [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.089 186666 DEBUG nova.virt.hardware [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.092 186666 DEBUG nova.virt.libvirt.vif [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:27:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-310914281',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-310914281',id=9,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84d22a8926d9401eb98cf092c0899a62',ramdisk_id='',reservation_id='r-xyjvx1b2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1567565925',owner_user_name='tempest-TestExecuteActionsV
iaActuator-1567565925-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:27:41Z,user_data=None,user_id='af924faf672a45b4b8708466af6eeb12',uuid=bff022af-564c-46f6-8756-17ea48c6fdc5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d89a6d19-fdc2-4b79-baf7-7f00bdc698b6", "address": "fa:16:3e:9a:66:f9", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd89a6d19-fd", "ovs_interfaceid": "d89a6d19-fdc2-4b79-baf7-7f00bdc698b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.092 186666 DEBUG nova.network.os_vif_util [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converting VIF {"id": "d89a6d19-fdc2-4b79-baf7-7f00bdc698b6", "address": "fa:16:3e:9a:66:f9", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd89a6d19-fd", "ovs_interfaceid": "d89a6d19-fdc2-4b79-baf7-7f00bdc698b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.093 186666 DEBUG nova.network.os_vif_util [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:66:f9,bridge_name='br-int',has_traffic_filtering=True,id=d89a6d19-fdc2-4b79-baf7-7f00bdc698b6,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd89a6d19-fd') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.094 186666 DEBUG nova.objects.instance [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lazy-loading 'pci_devices' on Instance uuid bff022af-564c-46f6-8756-17ea48c6fdc5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.256 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.601 186666 DEBUG nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] End _get_guest_xml xml=<domain type="kvm">
Feb 19 19:27:47 compute-0 nova_compute[186662]:   <uuid>bff022af-564c-46f6-8756-17ea48c6fdc5</uuid>
Feb 19 19:27:47 compute-0 nova_compute[186662]:   <name>instance-00000009</name>
Feb 19 19:27:47 compute-0 nova_compute[186662]:   <memory>131072</memory>
Feb 19 19:27:47 compute-0 nova_compute[186662]:   <vcpu>1</vcpu>
Feb 19 19:27:47 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-310914281</nova:name>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:27:47</nova:creationTime>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:27:47 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:27:47 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:27:47 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:27:47 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:27:47 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:27:47 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:27:47 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:27:47 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:27:47 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:27:47 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:27:47 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:27:47 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:27:47 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:27:47 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:27:47 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:27:47 compute-0 nova_compute[186662]:         <nova:user uuid="af924faf672a45b4b8708466af6eeb12">tempest-TestExecuteActionsViaActuator-1567565925-project-admin</nova:user>
Feb 19 19:27:47 compute-0 nova_compute[186662]:         <nova:project uuid="84d22a8926d9401eb98cf092c0899a62">tempest-TestExecuteActionsViaActuator-1567565925</nova:project>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:27:47 compute-0 nova_compute[186662]:         <nova:port uuid="d89a6d19-fdc2-4b79-baf7-7f00bdc698b6">
Feb 19 19:27:47 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:27:47 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:27:47 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <system>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <entry name="serial">bff022af-564c-46f6-8756-17ea48c6fdc5</entry>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <entry name="uuid">bff022af-564c-46f6-8756-17ea48c6fdc5</entry>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     </system>
Feb 19 19:27:47 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:27:47 compute-0 nova_compute[186662]:   <os>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:   </os>
Feb 19 19:27:47 compute-0 nova_compute[186662]:   <features>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <vmcoreinfo/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:   </features>
Feb 19 19:27:47 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:27:47 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact">
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <model>Nehalem</model>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <topology sockets="1" cores="1" threads="1"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:27:47 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5/disk"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5/disk.config"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <interface type="ethernet">
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <mac address="fa:16:3e:9a:66:f9"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <mtu size="1442"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <target dev="tapd89a6d19-fd"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <serial type="pty">
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5/console.log" append="off"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <video>
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     </video>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <controller type="usb" index="0"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:27:47 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:27:47 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:27:47 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:27:47 compute-0 nova_compute[186662]: </domain>
Feb 19 19:27:47 compute-0 nova_compute[186662]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.603 186666 DEBUG nova.compute.manager [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Preparing to wait for external event network-vif-plugged-d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.603 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "bff022af-564c-46f6-8756-17ea48c6fdc5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.604 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "bff022af-564c-46f6-8756-17ea48c6fdc5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.604 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "bff022af-564c-46f6-8756-17ea48c6fdc5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.606 186666 DEBUG nova.virt.libvirt.vif [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:27:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-310914281',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-310914281',id=9,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='84d22a8926d9401eb98cf092c0899a62',ramdisk_id='',reservation_id='r-xyjvx1b2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1567565925',owner_user_name='tempest-TestExecu
teActionsViaActuator-1567565925-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:27:41Z,user_data=None,user_id='af924faf672a45b4b8708466af6eeb12',uuid=bff022af-564c-46f6-8756-17ea48c6fdc5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d89a6d19-fdc2-4b79-baf7-7f00bdc698b6", "address": "fa:16:3e:9a:66:f9", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd89a6d19-fd", "ovs_interfaceid": "d89a6d19-fdc2-4b79-baf7-7f00bdc698b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.606 186666 DEBUG nova.network.os_vif_util [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converting VIF {"id": "d89a6d19-fdc2-4b79-baf7-7f00bdc698b6", "address": "fa:16:3e:9a:66:f9", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd89a6d19-fd", "ovs_interfaceid": "d89a6d19-fdc2-4b79-baf7-7f00bdc698b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.608 186666 DEBUG nova.network.os_vif_util [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:66:f9,bridge_name='br-int',has_traffic_filtering=True,id=d89a6d19-fdc2-4b79-baf7-7f00bdc698b6,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd89a6d19-fd') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.608 186666 DEBUG os_vif [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:66:f9,bridge_name='br-int',has_traffic_filtering=True,id=d89a6d19-fdc2-4b79-baf7-7f00bdc698b6,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd89a6d19-fd') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.609 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.610 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.611 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.612 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.613 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '98770403-23ce-58d8-a84e-4af0cf8fa7fe', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.616 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.620 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.620 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd89a6d19-fd, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.621 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapd89a6d19-fd, col_values=(('qos', UUID('9d04e2d7-5749-47ca-bf86-5d70c2a3d684')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.622 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapd89a6d19-fd, col_values=(('external_ids', {'iface-id': 'd89a6d19-fdc2-4b79-baf7-7f00bdc698b6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9a:66:f9', 'vm-uuid': 'bff022af-564c-46f6-8756-17ea48c6fdc5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.623 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:47 compute-0 NetworkManager[56519]: <info>  [1771529267.6247] manager: (tapd89a6d19-fd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.629 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.630 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:47 compute-0 nova_compute[186662]: 2026-02-19 19:27:47.631 186666 INFO os_vif [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:66:f9,bridge_name='br-int',has_traffic_filtering=True,id=d89a6d19-fdc2-4b79-baf7-7f00bdc698b6,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd89a6d19-fd')
Feb 19 19:27:49 compute-0 nova_compute[186662]: 2026-02-19 19:27:49.168 186666 DEBUG nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:27:49 compute-0 nova_compute[186662]: 2026-02-19 19:27:49.169 186666 DEBUG nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:27:49 compute-0 nova_compute[186662]: 2026-02-19 19:27:49.170 186666 DEBUG nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] No VIF found with MAC fa:16:3e:9a:66:f9, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Feb 19 19:27:49 compute-0 nova_compute[186662]: 2026-02-19 19:27:49.171 186666 INFO nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Using config drive
Feb 19 19:27:49 compute-0 nova_compute[186662]: 2026-02-19 19:27:49.685 186666 WARNING neutronclient.v2_0.client [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:27:50 compute-0 nova_compute[186662]: 2026-02-19 19:27:50.343 186666 INFO nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Creating config drive at /var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5/disk.config
Feb 19 19:27:50 compute-0 nova_compute[186662]: 2026-02-19 19:27:50.350 186666 DEBUG oslo_concurrency.processutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpuqnb8mdw execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:27:50 compute-0 nova_compute[186662]: 2026-02-19 19:27:50.480 186666 DEBUG oslo_concurrency.processutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpuqnb8mdw" returned: 0 in 0.129s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:27:50 compute-0 kernel: tapd89a6d19-fd: entered promiscuous mode
Feb 19 19:27:50 compute-0 NetworkManager[56519]: <info>  [1771529270.5383] manager: (tapd89a6d19-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Feb 19 19:27:50 compute-0 ovn_controller[96653]: 2026-02-19T19:27:50Z|00062|binding|INFO|Claiming lport d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 for this chassis.
Feb 19 19:27:50 compute-0 ovn_controller[96653]: 2026-02-19T19:27:50Z|00063|binding|INFO|d89a6d19-fdc2-4b79-baf7-7f00bdc698b6: Claiming fa:16:3e:9a:66:f9 10.100.0.10
Feb 19 19:27:50 compute-0 nova_compute[186662]: 2026-02-19 19:27:50.540 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:50.549 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:66:f9 10.100.0.10'], port_security=['fa:16:3e:9a:66:f9 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'bff022af-564c-46f6-8756-17ea48c6fdc5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-23744514-9581-483b-ba8d-38106bcd89ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84d22a8926d9401eb98cf092c0899a62', 'neutron:revision_number': '4', 'neutron:security_group_ids': '286c3adc-3464-487a-ae0e-8231188cef8f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46bbe139-eec8-4de4-bfdc-b507967fb452, chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=d89a6d19-fdc2-4b79-baf7-7f00bdc698b6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:27:50 compute-0 ovn_controller[96653]: 2026-02-19T19:27:50Z|00064|binding|INFO|Setting lport d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 ovn-installed in OVS
Feb 19 19:27:50 compute-0 ovn_controller[96653]: 2026-02-19T19:27:50Z|00065|binding|INFO|Setting lport d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 up in Southbound
Feb 19 19:27:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:50.550 105986 INFO neutron.agent.ovn.metadata.agent [-] Port d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 in datapath 23744514-9581-483b-ba8d-38106bcd89ef bound to our chassis
Feb 19 19:27:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:50.552 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 23744514-9581-483b-ba8d-38106bcd89ef
Feb 19 19:27:50 compute-0 nova_compute[186662]: 2026-02-19 19:27:50.552 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:50.566 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9b6b5f-4a99-4dea-ad40-229461597fc5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:27:50 compute-0 systemd-machined[156014]: New machine qemu-5-instance-00000009.
Feb 19 19:27:50 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000009.
Feb 19 19:27:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:50.594 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[b7a56e85-a1f7-4790-a501-ce8a2ee29484]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:27:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:50.596 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[3ed74d64-dac5-4bbd-bd64-7b8c67529bce]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:27:50 compute-0 systemd-udevd[209868]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:27:50 compute-0 NetworkManager[56519]: <info>  [1771529270.6097] device (tapd89a6d19-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:27:50 compute-0 NetworkManager[56519]: <info>  [1771529270.6105] device (tapd89a6d19-fd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:27:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:50.620 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[b4c6f8df-bfde-41bf-a8ed-6fafe5c1a59a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:27:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:50.635 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[f0a7d6b1-2659-4a46-84a6-340044eb5c88]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap23744514-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:f0:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341570, 'reachable_time': 39245, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209878, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:27:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:50.648 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[53f75374-ade8-402f-9921-f800ba401728]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap23744514-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341576, 'tstamp': 341576}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209879, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap23744514-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341578, 'tstamp': 341578}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209879, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:27:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:50.649 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23744514-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:27:50 compute-0 nova_compute[186662]: 2026-02-19 19:27:50.651 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:50 compute-0 nova_compute[186662]: 2026-02-19 19:27:50.651 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:50.652 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23744514-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:27:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:50.652 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:27:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:50.652 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap23744514-90, col_values=(('external_ids', {'iface-id': '25b16785-ed71-4ba6-a91b-c4dcee5ff875'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:27:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:50.653 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:27:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:27:50.654 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ecc000c8-d58e-4976-8950-ffcb15a6420d]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-23744514-9581-483b-ba8d-38106bcd89ef\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 23744514-9581-483b-ba8d-38106bcd89ef\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:27:50 compute-0 nova_compute[186662]: 2026-02-19 19:27:50.715 186666 DEBUG nova.compute.manager [req-d090654e-c6aa-4816-8d26-4a09ffd6664b req-db6c26fa-5ebe-4854-a7a3-b303b44c60d1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Received event network-vif-plugged-d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:27:50 compute-0 nova_compute[186662]: 2026-02-19 19:27:50.716 186666 DEBUG oslo_concurrency.lockutils [req-d090654e-c6aa-4816-8d26-4a09ffd6664b req-db6c26fa-5ebe-4854-a7a3-b303b44c60d1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "bff022af-564c-46f6-8756-17ea48c6fdc5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:27:50 compute-0 nova_compute[186662]: 2026-02-19 19:27:50.716 186666 DEBUG oslo_concurrency.lockutils [req-d090654e-c6aa-4816-8d26-4a09ffd6664b req-db6c26fa-5ebe-4854-a7a3-b303b44c60d1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bff022af-564c-46f6-8756-17ea48c6fdc5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:27:50 compute-0 nova_compute[186662]: 2026-02-19 19:27:50.716 186666 DEBUG oslo_concurrency.lockutils [req-d090654e-c6aa-4816-8d26-4a09ffd6664b req-db6c26fa-5ebe-4854-a7a3-b303b44c60d1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bff022af-564c-46f6-8756-17ea48c6fdc5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:27:50 compute-0 nova_compute[186662]: 2026-02-19 19:27:50.717 186666 DEBUG nova.compute.manager [req-d090654e-c6aa-4816-8d26-4a09ffd6664b req-db6c26fa-5ebe-4854-a7a3-b303b44c60d1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Processing event network-vif-plugged-d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Feb 19 19:27:50 compute-0 nova_compute[186662]: 2026-02-19 19:27:50.809 186666 DEBUG nova.compute.manager [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Feb 19 19:27:50 compute-0 nova_compute[186662]: 2026-02-19 19:27:50.812 186666 DEBUG nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Feb 19 19:27:50 compute-0 nova_compute[186662]: 2026-02-19 19:27:50.816 186666 INFO nova.virt.libvirt.driver [-] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Instance spawned successfully.
Feb 19 19:27:50 compute-0 nova_compute[186662]: 2026-02-19 19:27:50.817 186666 DEBUG nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Feb 19 19:27:51 compute-0 nova_compute[186662]: 2026-02-19 19:27:51.330 186666 DEBUG nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:27:51 compute-0 nova_compute[186662]: 2026-02-19 19:27:51.331 186666 DEBUG nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:27:51 compute-0 nova_compute[186662]: 2026-02-19 19:27:51.332 186666 DEBUG nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:27:51 compute-0 nova_compute[186662]: 2026-02-19 19:27:51.333 186666 DEBUG nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:27:51 compute-0 nova_compute[186662]: 2026-02-19 19:27:51.334 186666 DEBUG nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:27:51 compute-0 nova_compute[186662]: 2026-02-19 19:27:51.335 186666 DEBUG nova.virt.libvirt.driver [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:27:51 compute-0 nova_compute[186662]: 2026-02-19 19:27:51.846 186666 INFO nova.compute.manager [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Took 8.97 seconds to spawn the instance on the hypervisor.
Feb 19 19:27:51 compute-0 nova_compute[186662]: 2026-02-19 19:27:51.847 186666 DEBUG nova.compute.manager [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:27:52 compute-0 nova_compute[186662]: 2026-02-19 19:27:52.257 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:52 compute-0 nova_compute[186662]: 2026-02-19 19:27:52.389 186666 INFO nova.compute.manager [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Took 14.27 seconds to build instance.
Feb 19 19:27:52 compute-0 nova_compute[186662]: 2026-02-19 19:27:52.646 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:52 compute-0 nova_compute[186662]: 2026-02-19 19:27:52.784 186666 DEBUG nova.compute.manager [req-9c3d3fc8-83a0-47f8-bd37-1794c321223a req-58f2d577-924e-4d9d-a81f-0566a528601c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Received event network-vif-plugged-d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:27:52 compute-0 nova_compute[186662]: 2026-02-19 19:27:52.784 186666 DEBUG oslo_concurrency.lockutils [req-9c3d3fc8-83a0-47f8-bd37-1794c321223a req-58f2d577-924e-4d9d-a81f-0566a528601c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "bff022af-564c-46f6-8756-17ea48c6fdc5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:27:52 compute-0 nova_compute[186662]: 2026-02-19 19:27:52.784 186666 DEBUG oslo_concurrency.lockutils [req-9c3d3fc8-83a0-47f8-bd37-1794c321223a req-58f2d577-924e-4d9d-a81f-0566a528601c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bff022af-564c-46f6-8756-17ea48c6fdc5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:27:52 compute-0 nova_compute[186662]: 2026-02-19 19:27:52.785 186666 DEBUG oslo_concurrency.lockutils [req-9c3d3fc8-83a0-47f8-bd37-1794c321223a req-58f2d577-924e-4d9d-a81f-0566a528601c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bff022af-564c-46f6-8756-17ea48c6fdc5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:27:52 compute-0 nova_compute[186662]: 2026-02-19 19:27:52.785 186666 DEBUG nova.compute.manager [req-9c3d3fc8-83a0-47f8-bd37-1794c321223a req-58f2d577-924e-4d9d-a81f-0566a528601c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] No waiting events found dispatching network-vif-plugged-d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:27:52 compute-0 nova_compute[186662]: 2026-02-19 19:27:52.785 186666 WARNING nova.compute.manager [req-9c3d3fc8-83a0-47f8-bd37-1794c321223a req-58f2d577-924e-4d9d-a81f-0566a528601c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Received unexpected event network-vif-plugged-d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 for instance with vm_state active and task_state None.
Feb 19 19:27:52 compute-0 nova_compute[186662]: 2026-02-19 19:27:52.895 186666 DEBUG oslo_concurrency.lockutils [None req-668238a1-6be6-4cc9-b1fc-fcd60f061bab af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "bff022af-564c-46f6-8756-17ea48c6fdc5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.790s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:27:57 compute-0 nova_compute[186662]: 2026-02-19 19:27:57.259 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:57 compute-0 podman[209888]: 2026-02-19 19:27:57.27918962 +0000 UTC m=+0.052971395 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 19 19:27:57 compute-0 nova_compute[186662]: 2026-02-19 19:27:57.647 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:27:59 compute-0 podman[196025]: time="2026-02-19T19:27:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:27:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:27:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17220 "" "Go-http-client/1.1"
Feb 19 19:27:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:27:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2632 "" "Go-http-client/1.1"
Feb 19 19:28:00 compute-0 podman[209907]: 2026-02-19 19:28:00.267245426 +0000 UTC m=+0.040219138 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1770267347, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, distribution-scope=public, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9)
Feb 19 19:28:01 compute-0 openstack_network_exporter[198916]: ERROR   19:28:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:28:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:28:01 compute-0 openstack_network_exporter[198916]: ERROR   19:28:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:28:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:28:02 compute-0 nova_compute[186662]: 2026-02-19 19:28:02.223 186666 DEBUG nova.compute.manager [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Stashing vm_state: active _prep_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:6173
Feb 19 19:28:02 compute-0 nova_compute[186662]: 2026-02-19 19:28:02.261 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:02 compute-0 podman[209930]: 2026-02-19 19:28:02.307873676 +0000 UTC m=+0.084891352 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Feb 19 19:28:02 compute-0 nova_compute[186662]: 2026-02-19 19:28:02.648 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:02 compute-0 nova_compute[186662]: 2026-02-19 19:28:02.786 186666 DEBUG oslo_concurrency.lockutils [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:28:02 compute-0 nova_compute[186662]: 2026-02-19 19:28:02.787 186666 DEBUG oslo_concurrency.lockutils [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:28:03 compute-0 nova_compute[186662]: 2026-02-19 19:28:03.297 186666 DEBUG nova.objects.instance [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'pci_requests' on Instance uuid 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:28:03 compute-0 nova_compute[186662]: 2026-02-19 19:28:03.521 186666 DEBUG nova.virt.libvirt.driver [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Creating tmpfile /var/lib/nova/instances/tmpnb5xsrnt to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Feb 19 19:28:03 compute-0 nova_compute[186662]: 2026-02-19 19:28:03.522 186666 WARNING neutronclient.v2_0.client [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:03 compute-0 nova_compute[186662]: 2026-02-19 19:28:03.627 186666 DEBUG nova.compute.manager [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=70656,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnb5xsrnt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Feb 19 19:28:03 compute-0 nova_compute[186662]: 2026-02-19 19:28:03.807 186666 DEBUG nova.virt.hardware [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Feb 19 19:28:03 compute-0 nova_compute[186662]: 2026-02-19 19:28:03.807 186666 INFO nova.compute.claims [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Claim successful on node compute-0.ctlplane.example.com
Feb 19 19:28:03 compute-0 nova_compute[186662]: 2026-02-19 19:28:03.808 186666 DEBUG nova.objects.instance [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'resources' on Instance uuid 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:28:04 compute-0 nova_compute[186662]: 2026-02-19 19:28:04.314 186666 DEBUG nova.objects.base [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Object Instance<9dfd1282-8be2-45bd-ad6c-5b8c9be761bc> lazy-loaded attributes: pci_requests,resources wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Feb 19 19:28:04 compute-0 nova_compute[186662]: 2026-02-19 19:28:04.315 186666 DEBUG nova.objects.instance [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'numa_topology' on Instance uuid 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:28:04 compute-0 ovn_controller[96653]: 2026-02-19T19:28:04Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9a:66:f9 10.100.0.10
Feb 19 19:28:04 compute-0 ovn_controller[96653]: 2026-02-19T19:28:04Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9a:66:f9 10.100.0.10
Feb 19 19:28:04 compute-0 nova_compute[186662]: 2026-02-19 19:28:04.821 186666 DEBUG nova.objects.base [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Object Instance<9dfd1282-8be2-45bd-ad6c-5b8c9be761bc> lazy-loaded attributes: pci_requests,resources,numa_topology wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Feb 19 19:28:04 compute-0 nova_compute[186662]: 2026-02-19 19:28:04.822 186666 DEBUG nova.objects.instance [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:28:05 compute-0 nova_compute[186662]: 2026-02-19 19:28:05.329 186666 DEBUG nova.objects.base [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Object Instance<9dfd1282-8be2-45bd-ad6c-5b8c9be761bc> lazy-loaded attributes: pci_requests,resources,numa_topology,pci_devices wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Feb 19 19:28:05 compute-0 nova_compute[186662]: 2026-02-19 19:28:05.690 186666 WARNING neutronclient.v2_0.client [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:05 compute-0 nova_compute[186662]: 2026-02-19 19:28:05.841 186666 INFO nova.compute.resource_tracker [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Updating resource usage from migration ec57da9f-7261-472d-acdf-7cca20c774a7
Feb 19 19:28:05 compute-0 nova_compute[186662]: 2026-02-19 19:28:05.842 186666 DEBUG nova.compute.resource_tracker [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Starting to track incoming migration ec57da9f-7261-472d-acdf-7cca20c774a7 with flavor 3881472c-99fb-4fe5-ab4d-bf6223e45537 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Feb 19 19:28:06 compute-0 nova_compute[186662]: 2026-02-19 19:28:06.512 186666 DEBUG nova.compute.provider_tree [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:28:07 compute-0 nova_compute[186662]: 2026-02-19 19:28:07.018 186666 DEBUG nova.scheduler.client.report [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:28:07 compute-0 nova_compute[186662]: 2026-02-19 19:28:07.263 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:07 compute-0 nova_compute[186662]: 2026-02-19 19:28:07.531 186666 DEBUG oslo_concurrency.lockutils [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 4.744s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:28:07 compute-0 nova_compute[186662]: 2026-02-19 19:28:07.531 186666 INFO nova.compute.manager [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Migrating
Feb 19 19:28:07 compute-0 nova_compute[186662]: 2026-02-19 19:28:07.651 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:09 compute-0 nova_compute[186662]: 2026-02-19 19:28:09.514 186666 DEBUG nova.compute.manager [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=70656,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnb5xsrnt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7874c50f-bdd7-48d9-bbab-6db09f173178',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Feb 19 19:28:09 compute-0 nova_compute[186662]: 2026-02-19 19:28:09.955 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:28:09 compute-0 nova_compute[186662]: 2026-02-19 19:28:09.955 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:28:09 compute-0 nova_compute[186662]: 2026-02-19 19:28:09.955 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:28:09 compute-0 nova_compute[186662]: 2026-02-19 19:28:09.956 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:28:09 compute-0 nova_compute[186662]: 2026-02-19 19:28:09.956 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:28:10 compute-0 nova_compute[186662]: 2026-02-19 19:28:10.528 186666 DEBUG oslo_concurrency.lockutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-7874c50f-bdd7-48d9-bbab-6db09f173178" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:28:10 compute-0 nova_compute[186662]: 2026-02-19 19:28:10.530 186666 DEBUG oslo_concurrency.lockutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-7874c50f-bdd7-48d9-bbab-6db09f173178" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:28:10 compute-0 nova_compute[186662]: 2026-02-19 19:28:10.530 186666 DEBUG nova.network.neutron [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:28:10 compute-0 nova_compute[186662]: 2026-02-19 19:28:10.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:28:11 compute-0 nova_compute[186662]: 2026-02-19 19:28:11.038 186666 WARNING neutronclient.v2_0.client [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:11 compute-0 podman[209972]: 2026-02-19 19:28:11.275978696 +0000 UTC m=+0.052518662 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:28:11 compute-0 nova_compute[186662]: 2026-02-19 19:28:11.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:28:11 compute-0 nova_compute[186662]: 2026-02-19 19:28:11.910 186666 WARNING neutronclient.v2_0.client [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:12 compute-0 nova_compute[186662]: 2026-02-19 19:28:12.130 186666 DEBUG nova.network.neutron [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Updating instance_info_cache with network_info: [{"id": "73ee194b-90a5-4834-81db-341af15d3ad4", "address": "fa:16:3e:41:d2:d7", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73ee194b-90", "ovs_interfaceid": "73ee194b-90a5-4834-81db-341af15d3ad4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:28:12 compute-0 nova_compute[186662]: 2026-02-19 19:28:12.266 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:12 compute-0 nova_compute[186662]: 2026-02-19 19:28:12.639 186666 DEBUG oslo_concurrency.lockutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-7874c50f-bdd7-48d9-bbab-6db09f173178" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:28:12 compute-0 nova_compute[186662]: 2026-02-19 19:28:12.651 186666 DEBUG nova.virt.libvirt.driver [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=70656,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnb5xsrnt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7874c50f-bdd7-48d9-bbab-6db09f173178',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Feb 19 19:28:12 compute-0 nova_compute[186662]: 2026-02-19 19:28:12.652 186666 DEBUG nova.virt.libvirt.driver [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Creating instance directory: /var/lib/nova/instances/7874c50f-bdd7-48d9-bbab-6db09f173178 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Feb 19 19:28:12 compute-0 nova_compute[186662]: 2026-02-19 19:28:12.652 186666 DEBUG nova.virt.libvirt.driver [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Creating disk.info with the contents: {'/var/lib/nova/instances/7874c50f-bdd7-48d9-bbab-6db09f173178/disk': 'qcow2', '/var/lib/nova/instances/7874c50f-bdd7-48d9-bbab-6db09f173178/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Feb 19 19:28:12 compute-0 nova_compute[186662]: 2026-02-19 19:28:12.652 186666 DEBUG nova.virt.libvirt.driver [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Feb 19 19:28:12 compute-0 nova_compute[186662]: 2026-02-19 19:28:12.653 186666 DEBUG nova.objects.instance [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7874c50f-bdd7-48d9-bbab-6db09f173178 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:28:12 compute-0 nova_compute[186662]: 2026-02-19 19:28:12.654 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:12 compute-0 sshd-session[209997]: Accepted publickey for nova from 192.168.122.101 port 35618 ssh2: ECDSA SHA256:liNwzzNsv7m7uKbs0CZWtaLfd+C4L5fwY2+6uBHzm7c
Feb 19 19:28:12 compute-0 systemd-logind[822]: New session 34 of user nova.
Feb 19 19:28:12 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 19 19:28:12 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 19 19:28:12 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 19 19:28:12 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 19 19:28:12 compute-0 systemd[210001]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 19 19:28:12 compute-0 systemd[210001]: Queued start job for default target Main User Target.
Feb 19 19:28:12 compute-0 systemd[210001]: Created slice User Application Slice.
Feb 19 19:28:12 compute-0 systemd[210001]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 19 19:28:12 compute-0 systemd[210001]: Started Daily Cleanup of User's Temporary Directories.
Feb 19 19:28:12 compute-0 systemd[210001]: Reached target Paths.
Feb 19 19:28:12 compute-0 systemd[210001]: Reached target Timers.
Feb 19 19:28:12 compute-0 systemd[210001]: Starting D-Bus User Message Bus Socket...
Feb 19 19:28:12 compute-0 systemd[210001]: Starting Create User's Volatile Files and Directories...
Feb 19 19:28:12 compute-0 systemd[210001]: Finished Create User's Volatile Files and Directories.
Feb 19 19:28:12 compute-0 systemd[210001]: Listening on D-Bus User Message Bus Socket.
Feb 19 19:28:12 compute-0 systemd[210001]: Reached target Sockets.
Feb 19 19:28:12 compute-0 systemd[210001]: Reached target Basic System.
Feb 19 19:28:12 compute-0 systemd[210001]: Reached target Main User Target.
Feb 19 19:28:12 compute-0 systemd[210001]: Startup finished in 146ms.
Feb 19 19:28:12 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 19 19:28:12 compute-0 systemd[1]: Started Session 34 of User nova.
Feb 19 19:28:12 compute-0 sshd-session[209997]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 19 19:28:13 compute-0 sshd-session[210017]: Received disconnect from 192.168.122.101 port 35618:11: disconnected by user
Feb 19 19:28:13 compute-0 sshd-session[210017]: Disconnected from user nova 192.168.122.101 port 35618
Feb 19 19:28:13 compute-0 sshd-session[209997]: pam_unix(sshd:session): session closed for user nova
Feb 19 19:28:13 compute-0 systemd[1]: session-34.scope: Deactivated successfully.
Feb 19 19:28:13 compute-0 systemd-logind[822]: Session 34 logged out. Waiting for processes to exit.
Feb 19 19:28:13 compute-0 systemd-logind[822]: Removed session 34.
Feb 19 19:28:13 compute-0 sshd-session[210019]: Accepted publickey for nova from 192.168.122.101 port 35626 ssh2: ECDSA SHA256:liNwzzNsv7m7uKbs0CZWtaLfd+C4L5fwY2+6uBHzm7c
Feb 19 19:28:13 compute-0 systemd-logind[822]: New session 36 of user nova.
Feb 19 19:28:13 compute-0 systemd[1]: Started Session 36 of User nova.
Feb 19 19:28:13 compute-0 sshd-session[210019]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.159 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.163 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.165 186666 DEBUG oslo_concurrency.processutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:28:13 compute-0 sshd-session[210022]: Received disconnect from 192.168.122.101 port 35626:11: disconnected by user
Feb 19 19:28:13 compute-0 sshd-session[210022]: Disconnected from user nova 192.168.122.101 port 35626
Feb 19 19:28:13 compute-0 sshd-session[210019]: pam_unix(sshd:session): session closed for user nova
Feb 19 19:28:13 compute-0 systemd[1]: session-36.scope: Deactivated successfully.
Feb 19 19:28:13 compute-0 systemd-logind[822]: Session 36 logged out. Waiting for processes to exit.
Feb 19 19:28:13 compute-0 systemd-logind[822]: Removed session 36.
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.207 186666 DEBUG oslo_concurrency.processutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.208 186666 DEBUG oslo_concurrency.lockutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.208 186666 DEBUG oslo_concurrency.lockutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.209 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.212 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.212 186666 DEBUG oslo_concurrency.processutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.280 186666 DEBUG oslo_concurrency.processutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.281 186666 DEBUG oslo_concurrency.processutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/7874c50f-bdd7-48d9-bbab-6db09f173178/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.311 186666 DEBUG oslo_concurrency.processutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/7874c50f-bdd7-48d9-bbab-6db09f173178/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.311 186666 DEBUG oslo_concurrency.lockutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.312 186666 DEBUG oslo_concurrency.processutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.349 186666 DEBUG oslo_concurrency.processutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.349 186666 DEBUG nova.virt.disk.api [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Checking if we can resize image /var/lib/nova/instances/7874c50f-bdd7-48d9-bbab-6db09f173178/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.350 186666 DEBUG oslo_concurrency.processutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7874c50f-bdd7-48d9-bbab-6db09f173178/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.407 186666 DEBUG oslo_concurrency.processutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7874c50f-bdd7-48d9-bbab-6db09f173178/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.408 186666 DEBUG nova.virt.disk.api [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Cannot resize image /var/lib/nova/instances/7874c50f-bdd7-48d9-bbab-6db09f173178/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.408 186666 DEBUG nova.objects.instance [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'migration_context' on Instance uuid 7874c50f-bdd7-48d9-bbab-6db09f173178 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.915 186666 DEBUG nova.objects.base [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Object Instance<7874c50f-bdd7-48d9-bbab-6db09f173178> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.916 186666 DEBUG oslo_concurrency.processutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/7874c50f-bdd7-48d9-bbab-6db09f173178/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.942 186666 DEBUG oslo_concurrency.processutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/7874c50f-bdd7-48d9-bbab-6db09f173178/disk.config 497664" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.943 186666 DEBUG nova.virt.libvirt.driver [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.944 186666 DEBUG nova.virt.libvirt.vif [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-02-19T19:26:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-105068416',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-105068416',id=6,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:26:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='84d22a8926d9401eb98cf092c0899a62',ramdisk_id='',reservation_id='r-nno39c23',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1567565925',owner_user_name='tempest-TestExecuteActionsViaActuator-1567565925-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:26:51Z,user_data=None,user_id='af924faf672a45b4b8708466af6eeb12',uuid=7874c50f-bdd7-48d9-bbab-6db09f173178,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "73ee194b-90a5-4834-81db-341af15d3ad4", "address": "fa:16:3e:41:d2:d7", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap73ee194b-90", "ovs_interfaceid": "73ee194b-90a5-4834-81db-341af15d3ad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.944 186666 DEBUG nova.network.os_vif_util [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "73ee194b-90a5-4834-81db-341af15d3ad4", "address": "fa:16:3e:41:d2:d7", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap73ee194b-90", "ovs_interfaceid": "73ee194b-90a5-4834-81db-341af15d3ad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.945 186666 DEBUG nova.network.os_vif_util [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:d2:d7,bridge_name='br-int',has_traffic_filtering=True,id=73ee194b-90a5-4834-81db-341af15d3ad4,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73ee194b-90') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.945 186666 DEBUG os_vif [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:d2:d7,bridge_name='br-int',has_traffic_filtering=True,id=73ee194b-90a5-4834-81db-341af15d3ad4,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73ee194b-90') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.946 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.946 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.946 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.947 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.947 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '6ddb09c2-47d8-5170-af75-6c6bd8bd70d9', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.949 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.950 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.951 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.954 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.954 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap73ee194b-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.955 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap73ee194b-90, col_values=(('qos', UUID('3909986a-e279-40ac-af95-1b48ac5c3444')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.955 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap73ee194b-90, col_values=(('external_ids', {'iface-id': '73ee194b-90a5-4834-81db-341af15d3ad4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:41:d2:d7', 'vm-uuid': '7874c50f-bdd7-48d9-bbab-6db09f173178'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.956 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:13 compute-0 NetworkManager[56519]: <info>  [1771529293.9574] manager: (tap73ee194b-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.958 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.964 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.966 186666 INFO os_vif [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:d2:d7,bridge_name='br-int',has_traffic_filtering=True,id=73ee194b-90a5-4834-81db-341af15d3ad4,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73ee194b-90')
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.967 186666 DEBUG nova.virt.libvirt.driver [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.967 186666 DEBUG nova.compute.manager [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=70656,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnb5xsrnt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7874c50f-bdd7-48d9-bbab-6db09f173178',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Feb 19 19:28:13 compute-0 nova_compute[186662]: 2026-02-19 19:28:13.968 186666 WARNING neutronclient.v2_0.client [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:14 compute-0 nova_compute[186662]: 2026-02-19 19:28:14.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:28:14 compute-0 nova_compute[186662]: 2026-02-19 19:28:14.584 186666 WARNING neutronclient.v2_0.client [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:14 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:14.943 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:28:14 compute-0 nova_compute[186662]: 2026-02-19 19:28:14.944 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:14 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:14.945 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:28:14 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:14.946 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:15 compute-0 nova_compute[186662]: 2026-02-19 19:28:15.090 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:28:15 compute-0 nova_compute[186662]: 2026-02-19 19:28:15.090 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:28:15 compute-0 nova_compute[186662]: 2026-02-19 19:28:15.091 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:28:15 compute-0 nova_compute[186662]: 2026-02-19 19:28:15.091 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:28:15 compute-0 nova_compute[186662]: 2026-02-19 19:28:15.391 186666 DEBUG nova.network.neutron [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Port 73ee194b-90a5-4834-81db-341af15d3ad4 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Feb 19 19:28:15 compute-0 nova_compute[186662]: 2026-02-19 19:28:15.401 186666 DEBUG nova.compute.manager [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=70656,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnb5xsrnt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7874c50f-bdd7-48d9-bbab-6db09f173178',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Feb 19 19:28:15 compute-0 nova_compute[186662]: 2026-02-19 19:28:15.587 186666 DEBUG nova.compute.manager [req-731e91e1-ba65-47e4-8a22-b4709c51007c req-0cde05e1-3a06-4469-bf6c-20146cbcc89f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Received event network-vif-unplugged-8c60286e-6006-41af-ab4b-b58aa08d3ee2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:28:15 compute-0 nova_compute[186662]: 2026-02-19 19:28:15.587 186666 DEBUG oslo_concurrency.lockutils [req-731e91e1-ba65-47e4-8a22-b4709c51007c req-0cde05e1-3a06-4469-bf6c-20146cbcc89f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:28:15 compute-0 nova_compute[186662]: 2026-02-19 19:28:15.587 186666 DEBUG oslo_concurrency.lockutils [req-731e91e1-ba65-47e4-8a22-b4709c51007c req-0cde05e1-3a06-4469-bf6c-20146cbcc89f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:28:15 compute-0 nova_compute[186662]: 2026-02-19 19:28:15.588 186666 DEBUG oslo_concurrency.lockutils [req-731e91e1-ba65-47e4-8a22-b4709c51007c req-0cde05e1-3a06-4469-bf6c-20146cbcc89f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:28:15 compute-0 nova_compute[186662]: 2026-02-19 19:28:15.588 186666 DEBUG nova.compute.manager [req-731e91e1-ba65-47e4-8a22-b4709c51007c req-0cde05e1-3a06-4469-bf6c-20146cbcc89f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] No waiting events found dispatching network-vif-unplugged-8c60286e-6006-41af-ab4b-b58aa08d3ee2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:28:15 compute-0 nova_compute[186662]: 2026-02-19 19:28:15.588 186666 WARNING nova.compute.manager [req-731e91e1-ba65-47e4-8a22-b4709c51007c req-0cde05e1-3a06-4469-bf6c-20146cbcc89f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Received unexpected event network-vif-unplugged-8c60286e-6006-41af-ab4b-b58aa08d3ee2 for instance with vm_state active and task_state resize_migrating.
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.131 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.206 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.207 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.278 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.283 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.339 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.339 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.378 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk --force-share --output=json" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.381 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.421 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5/disk --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.422 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:28:16 compute-0 sshd-session[210054]: Accepted publickey for nova from 192.168.122.101 port 35634 ssh2: ECDSA SHA256:liNwzzNsv7m7uKbs0CZWtaLfd+C4L5fwY2+6uBHzm7c
Feb 19 19:28:16 compute-0 systemd-logind[822]: New session 37 of user nova.
Feb 19 19:28:16 compute-0 systemd[1]: Started Session 37 of User nova.
Feb 19 19:28:16 compute-0 sshd-session[210054]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.479 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.484 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.535 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.536 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.595 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.721 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.722 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.734 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.735 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5234MB free_disk=72.86074829101562GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.735 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:28:16 compute-0 nova_compute[186662]: 2026-02-19 19:28:16.735 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:28:16 compute-0 sshd-session[210064]: Received disconnect from 192.168.122.101 port 35634:11: disconnected by user
Feb 19 19:28:16 compute-0 sshd-session[210064]: Disconnected from user nova 192.168.122.101 port 35634
Feb 19 19:28:16 compute-0 sshd-session[210054]: pam_unix(sshd:session): session closed for user nova
Feb 19 19:28:16 compute-0 systemd[1]: session-37.scope: Deactivated successfully.
Feb 19 19:28:16 compute-0 systemd-logind[822]: Session 37 logged out. Waiting for processes to exit.
Feb 19 19:28:16 compute-0 systemd-logind[822]: Removed session 37.
Feb 19 19:28:16 compute-0 sshd-session[210075]: Accepted publickey for nova from 192.168.122.101 port 35638 ssh2: ECDSA SHA256:liNwzzNsv7m7uKbs0CZWtaLfd+C4L5fwY2+6uBHzm7c
Feb 19 19:28:16 compute-0 systemd-logind[822]: New session 38 of user nova.
Feb 19 19:28:16 compute-0 systemd[1]: Started Session 38 of User nova.
Feb 19 19:28:16 compute-0 sshd-session[210075]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 19 19:28:17 compute-0 sshd-session[210078]: Received disconnect from 192.168.122.101 port 35638:11: disconnected by user
Feb 19 19:28:17 compute-0 sshd-session[210078]: Disconnected from user nova 192.168.122.101 port 35638
Feb 19 19:28:17 compute-0 sshd-session[210075]: pam_unix(sshd:session): session closed for user nova
Feb 19 19:28:17 compute-0 systemd[1]: session-38.scope: Deactivated successfully.
Feb 19 19:28:17 compute-0 systemd-logind[822]: Session 38 logged out. Waiting for processes to exit.
Feb 19 19:28:17 compute-0 systemd-logind[822]: Removed session 38.
Feb 19 19:28:17 compute-0 sshd-session[210080]: Accepted publickey for nova from 192.168.122.101 port 35648 ssh2: ECDSA SHA256:liNwzzNsv7m7uKbs0CZWtaLfd+C4L5fwY2+6uBHzm7c
Feb 19 19:28:17 compute-0 systemd-logind[822]: New session 39 of user nova.
Feb 19 19:28:17 compute-0 systemd[1]: Started Session 39 of User nova.
Feb 19 19:28:17 compute-0 sshd-session[210080]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 19 19:28:17 compute-0 nova_compute[186662]: 2026-02-19 19:28:17.269 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:17 compute-0 sshd-session[210083]: Received disconnect from 192.168.122.101 port 35648:11: disconnected by user
Feb 19 19:28:17 compute-0 sshd-session[210083]: Disconnected from user nova 192.168.122.101 port 35648
Feb 19 19:28:17 compute-0 sshd-session[210080]: pam_unix(sshd:session): session closed for user nova
Feb 19 19:28:17 compute-0 systemd[1]: session-39.scope: Deactivated successfully.
Feb 19 19:28:17 compute-0 systemd-logind[822]: Session 39 logged out. Waiting for processes to exit.
Feb 19 19:28:17 compute-0 systemd-logind[822]: Removed session 39.
Feb 19 19:28:17 compute-0 nova_compute[186662]: 2026-02-19 19:28:17.768 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Migration for instance 7874c50f-bdd7-48d9-bbab-6db09f173178 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Feb 19 19:28:17 compute-0 nova_compute[186662]: 2026-02-19 19:28:17.768 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Migration for instance 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Feb 19 19:28:17 compute-0 nova_compute[186662]: 2026-02-19 19:28:17.783 186666 DEBUG nova.compute.manager [req-5574d117-3105-4ddc-a684-d9845f0ca4be req-889c57b6-5d01-45dd-90e8-449ecd4b3a95 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Received event network-vif-unplugged-8c60286e-6006-41af-ab4b-b58aa08d3ee2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:28:17 compute-0 nova_compute[186662]: 2026-02-19 19:28:17.784 186666 DEBUG oslo_concurrency.lockutils [req-5574d117-3105-4ddc-a684-d9845f0ca4be req-889c57b6-5d01-45dd-90e8-449ecd4b3a95 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:28:17 compute-0 nova_compute[186662]: 2026-02-19 19:28:17.785 186666 DEBUG oslo_concurrency.lockutils [req-5574d117-3105-4ddc-a684-d9845f0ca4be req-889c57b6-5d01-45dd-90e8-449ecd4b3a95 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:28:17 compute-0 nova_compute[186662]: 2026-02-19 19:28:17.785 186666 DEBUG oslo_concurrency.lockutils [req-5574d117-3105-4ddc-a684-d9845f0ca4be req-889c57b6-5d01-45dd-90e8-449ecd4b3a95 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:28:17 compute-0 nova_compute[186662]: 2026-02-19 19:28:17.785 186666 DEBUG nova.compute.manager [req-5574d117-3105-4ddc-a684-d9845f0ca4be req-889c57b6-5d01-45dd-90e8-449ecd4b3a95 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] No waiting events found dispatching network-vif-unplugged-8c60286e-6006-41af-ab4b-b58aa08d3ee2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:28:17 compute-0 nova_compute[186662]: 2026-02-19 19:28:17.786 186666 WARNING nova.compute.manager [req-5574d117-3105-4ddc-a684-d9845f0ca4be req-889c57b6-5d01-45dd-90e8-449ecd4b3a95 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Received unexpected event network-vif-unplugged-8c60286e-6006-41af-ab4b-b58aa08d3ee2 for instance with vm_state active and task_state resize_migrating.
Feb 19 19:28:18 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 19 19:28:18 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 19 19:28:18 compute-0 nova_compute[186662]: 2026-02-19 19:28:18.785 186666 INFO nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Updating resource usage from migration af5a8dd1-9a2a-4c65-81b2-390e5ccd83aa
Feb 19 19:28:18 compute-0 nova_compute[186662]: 2026-02-19 19:28:18.786 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Starting to track incoming migration af5a8dd1-9a2a-4c65-81b2-390e5ccd83aa with flavor 3881472c-99fb-4fe5-ab4d-bf6223e45537 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Feb 19 19:28:18 compute-0 NetworkManager[56519]: <info>  [1771529298.8738] manager: (tap73ee194b-90): new Tun device (/org/freedesktop/NetworkManager/Devices/35)
Feb 19 19:28:18 compute-0 kernel: tap73ee194b-90: entered promiscuous mode
Feb 19 19:28:18 compute-0 ovn_controller[96653]: 2026-02-19T19:28:18Z|00066|binding|INFO|Claiming lport 73ee194b-90a5-4834-81db-341af15d3ad4 for this additional chassis.
Feb 19 19:28:18 compute-0 ovn_controller[96653]: 2026-02-19T19:28:18Z|00067|binding|INFO|73ee194b-90a5-4834-81db-341af15d3ad4: Claiming fa:16:3e:41:d2:d7 10.100.0.5
Feb 19 19:28:18 compute-0 nova_compute[186662]: 2026-02-19 19:28:18.914 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:18 compute-0 systemd-udevd[210118]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:28:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:18.920 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:d2:d7 10.100.0.5'], port_security=['fa:16:3e:41:d2:d7 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7874c50f-bdd7-48d9-bbab-6db09f173178', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-23744514-9581-483b-ba8d-38106bcd89ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84d22a8926d9401eb98cf092c0899a62', 'neutron:revision_number': '10', 'neutron:security_group_ids': '286c3adc-3464-487a-ae0e-8231188cef8f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46bbe139-eec8-4de4-bfdc-b507967fb452, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[], logical_port=73ee194b-90a5-4834-81db-341af15d3ad4) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:28:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:18.921 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 73ee194b-90a5-4834-81db-341af15d3ad4 in datapath 23744514-9581-483b-ba8d-38106bcd89ef unbound from our chassis
Feb 19 19:28:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:18.923 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 23744514-9581-483b-ba8d-38106bcd89ef
Feb 19 19:28:18 compute-0 ovn_controller[96653]: 2026-02-19T19:28:18Z|00068|binding|INFO|Setting lport 73ee194b-90a5-4834-81db-341af15d3ad4 ovn-installed in OVS
Feb 19 19:28:18 compute-0 nova_compute[186662]: 2026-02-19 19:28:18.924 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:18 compute-0 nova_compute[186662]: 2026-02-19 19:28:18.934 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:18 compute-0 nova_compute[186662]: 2026-02-19 19:28:18.936 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:18 compute-0 NetworkManager[56519]: <info>  [1771529298.9379] device (tap73ee194b-90): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:28:18 compute-0 NetworkManager[56519]: <info>  [1771529298.9387] device (tap73ee194b-90): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:28:18 compute-0 systemd-machined[156014]: New machine qemu-6-instance-00000006.
Feb 19 19:28:18 compute-0 nova_compute[186662]: 2026-02-19 19:28:18.956 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:18.961 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[7c4d05df-b1ba-4ddc-807f-2dcf810e3ecd]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:18 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000006.
Feb 19 19:28:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:19.000 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[b479cf73-625b-4ed8-b1eb-fcf40d0aba48]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:19.003 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[c7a86413-f712-4a16-9234-4729846be8d3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:19.035 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[3c5c91d4-145a-45aa-ba4d-2a031ab8e62a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:19.055 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[5cd086a5-c06d-47fe-a4c5-4284a1d33a2f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap23744514-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:f0:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 11, 'rx_bytes': 868, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 11, 'rx_bytes': 868, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341570, 'reachable_time': 39578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210135, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:19.071 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[d0ffb92b-5893-44bc-ba75-d99c309f735c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap23744514-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341576, 'tstamp': 341576}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210136, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap23744514-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341578, 'tstamp': 341578}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210136, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:19.073 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23744514-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:19 compute-0 nova_compute[186662]: 2026-02-19 19:28:19.074 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:19 compute-0 nova_compute[186662]: 2026-02-19 19:28:19.075 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:19.076 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23744514-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:19.076 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:28:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:19.077 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap23744514-90, col_values=(('external_ids', {'iface-id': '25b16785-ed71-4ba6-a91b-c4dcee5ff875'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:19.077 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:28:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:19.079 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ef3de17e-54ac-4d5e-934e-55100d586a51]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-23744514-9581-483b-ba8d-38106bcd89ef\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 23744514-9581-483b-ba8d-38106bcd89ef\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:19 compute-0 nova_compute[186662]: 2026-02-19 19:28:19.291 186666 INFO nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Updating resource usage from migration ec57da9f-7261-472d-acdf-7cca20c774a7
Feb 19 19:28:19 compute-0 nova_compute[186662]: 2026-02-19 19:28:19.291 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Starting to track incoming migration ec57da9f-7261-472d-acdf-7cca20c774a7 with flavor 3881472c-99fb-4fe5-ab4d-bf6223e45537 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Feb 19 19:28:19 compute-0 nova_compute[186662]: 2026-02-19 19:28:19.330 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance 72cc675e-4d5d-48c5-8c12-f9a42e168294 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:28:19 compute-0 nova_compute[186662]: 2026-02-19 19:28:19.331 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance df77e346-76b1-4b06-8611-44d3ac9fc3ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:28:19 compute-0 nova_compute[186662]: 2026-02-19 19:28:19.331 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance 9176d9ab-648d-4303-8d54-b208a7e1d395 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:28:19 compute-0 nova_compute[186662]: 2026-02-19 19:28:19.331 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance bff022af-564c-46f6-8756-17ea48c6fdc5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:28:19 compute-0 sshd-session[210085]: Invalid user albi from 182.75.216.74 port 32035
Feb 19 19:28:19 compute-0 nova_compute[186662]: 2026-02-19 19:28:19.692 186666 WARNING neutronclient.v2_0.client [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:19 compute-0 sshd-session[210085]: Received disconnect from 182.75.216.74 port 32035:11: Bye Bye [preauth]
Feb 19 19:28:19 compute-0 sshd-session[210085]: Disconnected from invalid user albi 182.75.216.74 port 32035 [preauth]
Feb 19 19:28:19 compute-0 nova_compute[186662]: 2026-02-19 19:28:19.838 186666 WARNING nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance 7874c50f-bdd7-48d9-bbab-6db09f173178 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Feb 19 19:28:20 compute-0 nova_compute[186662]: 2026-02-19 19:28:20.323 186666 INFO nova.network.neutron [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Updating port 8c60286e-6006-41af-ab4b-b58aa08d3ee2 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Feb 19 19:28:20 compute-0 nova_compute[186662]: 2026-02-19 19:28:20.345 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Instance with task_state "resize_migrated" is not being actively managed by this compute host but has allocations referencing this compute node (11ecaf50-b8a2-48b5-a41c-a8b0b10798d6): {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. Skipping heal of allocations during the task state transition. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1813
Feb 19 19:28:20 compute-0 nova_compute[186662]: 2026-02-19 19:28:20.346 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 6 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:28:20 compute-0 nova_compute[186662]: 2026-02-19 19:28:20.346 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1344MB phys_disk=79GB used_disk=6GB total_vcpus=8 used_vcpus=6 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:28:16 up 59 min,  0 user,  load average: 0.63, 0.39, 0.39\n', 'num_instances': '4', 'num_vm_active': '4', 'num_task_None': '4', 'num_os_type_None': '4', 'num_proj_84d22a8926d9401eb98cf092c0899a62': '4', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:28:20 compute-0 nova_compute[186662]: 2026-02-19 19:28:20.486 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:28:20 compute-0 nova_compute[186662]: 2026-02-19 19:28:20.994 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:28:21 compute-0 nova_compute[186662]: 2026-02-19 19:28:21.132 186666 DEBUG oslo_concurrency.lockutils [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-9dfd1282-8be2-45bd-ad6c-5b8c9be761bc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:28:21 compute-0 nova_compute[186662]: 2026-02-19 19:28:21.132 186666 DEBUG oslo_concurrency.lockutils [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-9dfd1282-8be2-45bd-ad6c-5b8c9be761bc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:28:21 compute-0 nova_compute[186662]: 2026-02-19 19:28:21.132 186666 DEBUG nova.network.neutron [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:28:21 compute-0 nova_compute[186662]: 2026-02-19 19:28:21.176 186666 DEBUG nova.compute.manager [req-9b8e4d2a-1114-4550-989c-9a4b0aba1ac0 req-880baf1e-ec38-400d-80d9-0c1209fd8dda 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Received event network-changed-8c60286e-6006-41af-ab4b-b58aa08d3ee2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:28:21 compute-0 nova_compute[186662]: 2026-02-19 19:28:21.177 186666 DEBUG nova.compute.manager [req-9b8e4d2a-1114-4550-989c-9a4b0aba1ac0 req-880baf1e-ec38-400d-80d9-0c1209fd8dda 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Refreshing instance network info cache due to event network-changed-8c60286e-6006-41af-ab4b-b58aa08d3ee2. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Feb 19 19:28:21 compute-0 nova_compute[186662]: 2026-02-19 19:28:21.177 186666 DEBUG oslo_concurrency.lockutils [req-9b8e4d2a-1114-4550-989c-9a4b0aba1ac0 req-880baf1e-ec38-400d-80d9-0c1209fd8dda 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-9dfd1282-8be2-45bd-ad6c-5b8c9be761bc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:28:21 compute-0 nova_compute[186662]: 2026-02-19 19:28:21.502 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:28:21 compute-0 nova_compute[186662]: 2026-02-19 19:28:21.503 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.767s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:28:21 compute-0 ovn_controller[96653]: 2026-02-19T19:28:21Z|00069|binding|INFO|Claiming lport 73ee194b-90a5-4834-81db-341af15d3ad4 for this chassis.
Feb 19 19:28:21 compute-0 ovn_controller[96653]: 2026-02-19T19:28:21Z|00070|binding|INFO|73ee194b-90a5-4834-81db-341af15d3ad4: Claiming fa:16:3e:41:d2:d7 10.100.0.5
Feb 19 19:28:21 compute-0 ovn_controller[96653]: 2026-02-19T19:28:21Z|00071|binding|INFO|Setting lport 73ee194b-90a5-4834-81db-341af15d3ad4 up in Southbound
Feb 19 19:28:21 compute-0 nova_compute[186662]: 2026-02-19 19:28:21.638 186666 WARNING neutronclient.v2_0.client [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:22 compute-0 nova_compute[186662]: 2026-02-19 19:28:22.271 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:22 compute-0 nova_compute[186662]: 2026-02-19 19:28:22.643 186666 INFO nova.compute.manager [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Post operation of migration started
Feb 19 19:28:22 compute-0 nova_compute[186662]: 2026-02-19 19:28:22.643 186666 WARNING neutronclient.v2_0.client [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:22 compute-0 nova_compute[186662]: 2026-02-19 19:28:22.854 186666 WARNING neutronclient.v2_0.client [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:22 compute-0 nova_compute[186662]: 2026-02-19 19:28:22.854 186666 WARNING neutronclient.v2_0.client [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:23 compute-0 nova_compute[186662]: 2026-02-19 19:28:23.473 186666 WARNING neutronclient.v2_0.client [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:23 compute-0 nova_compute[186662]: 2026-02-19 19:28:23.499 186666 DEBUG oslo_concurrency.lockutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-7874c50f-bdd7-48d9-bbab-6db09f173178" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:28:23 compute-0 nova_compute[186662]: 2026-02-19 19:28:23.500 186666 DEBUG oslo_concurrency.lockutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-7874c50f-bdd7-48d9-bbab-6db09f173178" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:28:23 compute-0 nova_compute[186662]: 2026-02-19 19:28:23.500 186666 DEBUG nova.network.neutron [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:28:23 compute-0 nova_compute[186662]: 2026-02-19 19:28:23.658 186666 DEBUG nova.network.neutron [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Updating instance_info_cache with network_info: [{"id": "8c60286e-6006-41af-ab4b-b58aa08d3ee2", "address": "fa:16:3e:89:de:56", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c60286e-60", "ovs_interfaceid": "8c60286e-6006-41af-ab4b-b58aa08d3ee2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:28:23 compute-0 nova_compute[186662]: 2026-02-19 19:28:23.958 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:24 compute-0 nova_compute[186662]: 2026-02-19 19:28:24.006 186666 WARNING neutronclient.v2_0.client [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:24 compute-0 nova_compute[186662]: 2026-02-19 19:28:24.164 186666 DEBUG oslo_concurrency.lockutils [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-9dfd1282-8be2-45bd-ad6c-5b8c9be761bc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:28:24 compute-0 nova_compute[186662]: 2026-02-19 19:28:24.167 186666 DEBUG oslo_concurrency.lockutils [req-9b8e4d2a-1114-4550-989c-9a4b0aba1ac0 req-880baf1e-ec38-400d-80d9-0c1209fd8dda 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-9dfd1282-8be2-45bd-ad6c-5b8c9be761bc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:28:24 compute-0 nova_compute[186662]: 2026-02-19 19:28:24.167 186666 DEBUG nova.network.neutron [req-9b8e4d2a-1114-4550-989c-9a4b0aba1ac0 req-880baf1e-ec38-400d-80d9-0c1209fd8dda 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Refreshing network info cache for port 8c60286e-6006-41af-ab4b-b58aa08d3ee2 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Feb 19 19:28:24 compute-0 nova_compute[186662]: 2026-02-19 19:28:24.674 186666 WARNING neutronclient.v2_0.client [req-9b8e4d2a-1114-4550-989c-9a4b0aba1ac0 req-880baf1e-ec38-400d-80d9-0c1209fd8dda 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:24 compute-0 nova_compute[186662]: 2026-02-19 19:28:24.705 186666 DEBUG nova.virt.libvirt.driver [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Starting finish_migration finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12604
Feb 19 19:28:24 compute-0 nova_compute[186662]: 2026-02-19 19:28:24.708 186666 DEBUG nova.virt.libvirt.driver [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Instance directory exists: not creating _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5134
Feb 19 19:28:24 compute-0 nova_compute[186662]: 2026-02-19 19:28:24.708 186666 INFO nova.virt.libvirt.driver [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Creating image(s)
Feb 19 19:28:24 compute-0 nova_compute[186662]: 2026-02-19 19:28:24.711 186666 DEBUG nova.objects.instance [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:28:24 compute-0 nova_compute[186662]: 2026-02-19 19:28:24.840 186666 WARNING neutronclient.v2_0.client [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.035 186666 DEBUG nova.network.neutron [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Updating instance_info_cache with network_info: [{"id": "73ee194b-90a5-4834-81db-341af15d3ad4", "address": "fa:16:3e:41:d2:d7", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73ee194b-90", "ovs_interfaceid": "73ee194b-90a5-4834-81db-341af15d3ad4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.097 186666 WARNING neutronclient.v2_0.client [req-9b8e4d2a-1114-4550-989c-9a4b0aba1ac0 req-880baf1e-ec38-400d-80d9-0c1209fd8dda 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.217 186666 DEBUG oslo_concurrency.processutils [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.272 186666 DEBUG nova.network.neutron [req-9b8e4d2a-1114-4550-989c-9a4b0aba1ac0 req-880baf1e-ec38-400d-80d9-0c1209fd8dda 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Updated VIF entry in instance network info cache for port 8c60286e-6006-41af-ab4b-b58aa08d3ee2. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.273 186666 DEBUG nova.network.neutron [req-9b8e4d2a-1114-4550-989c-9a4b0aba1ac0 req-880baf1e-ec38-400d-80d9-0c1209fd8dda 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Updating instance_info_cache with network_info: [{"id": "8c60286e-6006-41af-ab4b-b58aa08d3ee2", "address": "fa:16:3e:89:de:56", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c60286e-60", "ovs_interfaceid": "8c60286e-6006-41af-ab4b-b58aa08d3ee2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.275 186666 DEBUG oslo_concurrency.processutils [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.275 186666 DEBUG nova.virt.disk.api [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Checking if we can resize image /var/lib/nova/instances/9dfd1282-8be2-45bd-ad6c-5b8c9be761bc/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.276 186666 DEBUG oslo_concurrency.processutils [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dfd1282-8be2-45bd-ad6c-5b8c9be761bc/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.320 186666 DEBUG oslo_concurrency.processutils [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dfd1282-8be2-45bd-ad6c-5b8c9be761bc/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.320 186666 DEBUG nova.virt.disk.api [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Cannot resize image /var/lib/nova/instances/9dfd1282-8be2-45bd-ad6c-5b8c9be761bc/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.541 186666 DEBUG oslo_concurrency.lockutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-7874c50f-bdd7-48d9-bbab-6db09f173178" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.781 186666 DEBUG oslo_concurrency.lockutils [req-9b8e4d2a-1114-4550-989c-9a4b0aba1ac0 req-880baf1e-ec38-400d-80d9-0c1209fd8dda 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-9dfd1282-8be2-45bd-ad6c-5b8c9be761bc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.825 186666 DEBUG nova.virt.libvirt.driver [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Did not create local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5272
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.825 186666 DEBUG nova.virt.libvirt.driver [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Ensure instance console log exists: /var/lib/nova/instances/9dfd1282-8be2-45bd-ad6c-5b8c9be761bc/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.826 186666 DEBUG oslo_concurrency.lockutils [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.826 186666 DEBUG oslo_concurrency.lockutils [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.826 186666 DEBUG oslo_concurrency.lockutils [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.828 186666 DEBUG nova.virt.libvirt.driver [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Start _get_guest_xml network_info=[{"id": "8c60286e-6006-41af-ab4b-b58aa08d3ee2", "address": "fa:16:3e:89:de:56", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "vif_mac": "fa:16:3e:89:de:56"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c60286e-60", "ovs_interfaceid": "8c60286e-6006-41af-ab4b-b58aa08d3ee2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'image_id': 'b8007ea6-afa7-4c5a-abc0-d9d7338ce087'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.831 186666 WARNING nova.virt.libvirt.driver [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.832 186666 DEBUG nova.virt.driver [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1530972167', uuid='9dfd1282-8be2-45bd-ad6c-5b8c9be761bc'), owner=OwnerMeta(userid='af924faf672a45b4b8708466af6eeb12', username='tempest-TestExecuteActionsViaActuator-1567565925-project-admin', projectid='84d22a8926d9401eb98cf092c0899a62', projectname='tempest-TestExecuteActionsViaActuator-1567565925'), image=ImageMeta(id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_cdrom_bus': 'sata', 'hw_disk_bus': 'virtio', 'hw_input_bus': 'usb', 'hw_machine_type': 'q35', 'hw_pointer_model': 'usbtablet', 'hw_rng_model': 'virtio', 'hw_video_model': 'virtio', 'hw_vif_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "8c60286e-6006-41af-ab4b-b58aa08d3ee2", "address": "fa:16:3e:89:de:56", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "vif_mac": "fa:16:3e:89:de:56"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c60286e-60", "ovs_interfaceid": "8c60286e-6006-41af-ab4b-b58aa08d3ee2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1771529305.8327305) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.835 186666 DEBUG nova.virt.libvirt.host [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.835 186666 DEBUG nova.virt.libvirt.host [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.838 186666 DEBUG nova.virt.libvirt.host [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.838 186666 DEBUG nova.virt.libvirt.host [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.839 186666 DEBUG nova.virt.libvirt.driver [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.839 186666 DEBUG nova.virt.hardware [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-19T19:19:33Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.839 186666 DEBUG nova.virt.hardware [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.840 186666 DEBUG nova.virt.hardware [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.840 186666 DEBUG nova.virt.hardware [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.840 186666 DEBUG nova.virt.hardware [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.840 186666 DEBUG nova.virt.hardware [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.840 186666 DEBUG nova.virt.hardware [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.841 186666 DEBUG nova.virt.hardware [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.841 186666 DEBUG nova.virt.hardware [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.841 186666 DEBUG nova.virt.hardware [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.841 186666 DEBUG nova.virt.hardware [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Feb 19 19:28:25 compute-0 nova_compute[186662]: 2026-02-19 19:28:25.841 186666 DEBUG nova.objects.instance [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.057 186666 DEBUG oslo_concurrency.lockutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.057 186666 DEBUG oslo_concurrency.lockutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.058 186666 DEBUG oslo_concurrency.lockutils [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.061 186666 INFO nova.virt.libvirt.driver [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 19 19:28:26 compute-0 virtqemud[186157]: Domain id=6 name='instance-00000006' uuid=7874c50f-bdd7-48d9-bbab-6db09f173178 is tainted: custom-monitor
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.346 186666 DEBUG nova.objects.base [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Object Instance<9dfd1282-8be2-45bd-ad6c-5b8c9be761bc> lazy-loaded attributes: trusted_certs,vcpu_model wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.349 186666 DEBUG oslo_concurrency.processutils [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dfd1282-8be2-45bd-ad6c-5b8c9be761bc/disk.config --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.389 186666 DEBUG oslo_concurrency.processutils [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dfd1282-8be2-45bd-ad6c-5b8c9be761bc/disk.config --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.390 186666 DEBUG oslo_concurrency.lockutils [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "/var/lib/nova/instances/9dfd1282-8be2-45bd-ad6c-5b8c9be761bc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.391 186666 DEBUG oslo_concurrency.lockutils [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "/var/lib/nova/instances/9dfd1282-8be2-45bd-ad6c-5b8c9be761bc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.392 186666 DEBUG oslo_concurrency.lockutils [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "/var/lib/nova/instances/9dfd1282-8be2-45bd-ad6c-5b8c9be761bc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.394 186666 DEBUG nova.virt.libvirt.vif [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:27:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1530972167',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1530972167',id=8,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:27:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='84d22a8926d9401eb98cf092c0899a62',ramdisk_id='',reservation_id='r-sc65cpv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_vide
o_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1567565925',owner_user_name='tempest-TestExecuteActionsViaActuator-1567565925-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:28:18Z,user_data=None,user_id='af924faf672a45b4b8708466af6eeb12',uuid=9dfd1282-8be2-45bd-ad6c-5b8c9be761bc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8c60286e-6006-41af-ab4b-b58aa08d3ee2", "address": "fa:16:3e:89:de:56", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "vif_mac": "fa:16:3e:89:de:56"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c60286e-60", "ovs_interfaceid": "8c60286e-6006-41af-ab4b-b58aa08d3ee2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.394 186666 DEBUG nova.network.os_vif_util [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "8c60286e-6006-41af-ab4b-b58aa08d3ee2", "address": "fa:16:3e:89:de:56", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "vif_mac": "fa:16:3e:89:de:56"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c60286e-60", "ovs_interfaceid": "8c60286e-6006-41af-ab4b-b58aa08d3ee2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.396 186666 DEBUG nova.network.os_vif_util [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:de:56,bridge_name='br-int',has_traffic_filtering=True,id=8c60286e-6006-41af-ab4b-b58aa08d3ee2,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c60286e-60') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.400 186666 DEBUG nova.virt.libvirt.driver [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] End _get_guest_xml xml=<domain type="kvm">
Feb 19 19:28:26 compute-0 nova_compute[186662]:   <uuid>9dfd1282-8be2-45bd-ad6c-5b8c9be761bc</uuid>
Feb 19 19:28:26 compute-0 nova_compute[186662]:   <name>instance-00000008</name>
Feb 19 19:28:26 compute-0 nova_compute[186662]:   <memory>131072</memory>
Feb 19 19:28:26 compute-0 nova_compute[186662]:   <vcpu>1</vcpu>
Feb 19 19:28:26 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1530972167</nova:name>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:28:25</nova:creationTime>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:28:26 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:28:26 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:28:26 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:28:26 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:28:26 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:28:26 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:28:26 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:28:26 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:28:26 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:28:26 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:28:26 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:28:26 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:28:26 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:28:26 compute-0 nova_compute[186662]:           <nova:property name="hw_cdrom_bus">sata</nova:property>
Feb 19 19:28:26 compute-0 nova_compute[186662]:           <nova:property name="hw_disk_bus">virtio</nova:property>
Feb 19 19:28:26 compute-0 nova_compute[186662]:           <nova:property name="hw_input_bus">usb</nova:property>
Feb 19 19:28:26 compute-0 nova_compute[186662]:           <nova:property name="hw_machine_type">q35</nova:property>
Feb 19 19:28:26 compute-0 nova_compute[186662]:           <nova:property name="hw_pointer_model">usbtablet</nova:property>
Feb 19 19:28:26 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:28:26 compute-0 nova_compute[186662]:           <nova:property name="hw_video_model">virtio</nova:property>
Feb 19 19:28:26 compute-0 nova_compute[186662]:           <nova:property name="hw_vif_model">virtio</nova:property>
Feb 19 19:28:26 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:28:26 compute-0 nova_compute[186662]:         <nova:user uuid="af924faf672a45b4b8708466af6eeb12">tempest-TestExecuteActionsViaActuator-1567565925-project-admin</nova:user>
Feb 19 19:28:26 compute-0 nova_compute[186662]:         <nova:project uuid="84d22a8926d9401eb98cf092c0899a62">tempest-TestExecuteActionsViaActuator-1567565925</nova:project>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:28:26 compute-0 nova_compute[186662]:         <nova:port uuid="8c60286e-6006-41af-ab4b-b58aa08d3ee2">
Feb 19 19:28:26 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:28:26 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:28:26 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <system>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <entry name="serial">9dfd1282-8be2-45bd-ad6c-5b8c9be761bc</entry>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <entry name="uuid">9dfd1282-8be2-45bd-ad6c-5b8c9be761bc</entry>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     </system>
Feb 19 19:28:26 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:28:26 compute-0 nova_compute[186662]:   <os>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:   </os>
Feb 19 19:28:26 compute-0 nova_compute[186662]:   <features>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <vmcoreinfo/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:   </features>
Feb 19 19:28:26 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:28:26 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact">
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <model>Nehalem</model>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <topology sockets="1" cores="1" threads="1"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:28:26 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/9dfd1282-8be2-45bd-ad6c-5b8c9be761bc/disk"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/9dfd1282-8be2-45bd-ad6c-5b8c9be761bc/disk.config"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <interface type="ethernet">
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <mac address="fa:16:3e:89:de:56"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <mtu size="1442"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <target dev="tap8c60286e-60"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <serial type="pty">
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/9dfd1282-8be2-45bd-ad6c-5b8c9be761bc/console.log" append="off"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <video>
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     </video>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <controller type="usb" index="0"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:28:26 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:28:26 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:28:26 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:28:26 compute-0 nova_compute[186662]: </domain>
Feb 19 19:28:26 compute-0 nova_compute[186662]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.402 186666 DEBUG nova.virt.libvirt.vif [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:27:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1530972167',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1530972167',id=8,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:27:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='84d22a8926d9401eb98cf092c0899a62',ramdisk_id='',reservation_id='r-sc65cpv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_vide
o_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1567565925',owner_user_name='tempest-TestExecuteActionsViaActuator-1567565925-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:28:18Z,user_data=None,user_id='af924faf672a45b4b8708466af6eeb12',uuid=9dfd1282-8be2-45bd-ad6c-5b8c9be761bc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8c60286e-6006-41af-ab4b-b58aa08d3ee2", "address": "fa:16:3e:89:de:56", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "vif_mac": "fa:16:3e:89:de:56"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c60286e-60", "ovs_interfaceid": "8c60286e-6006-41af-ab4b-b58aa08d3ee2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.403 186666 DEBUG nova.network.os_vif_util [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "8c60286e-6006-41af-ab4b-b58aa08d3ee2", "address": "fa:16:3e:89:de:56", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "vif_mac": "fa:16:3e:89:de:56"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c60286e-60", "ovs_interfaceid": "8c60286e-6006-41af-ab4b-b58aa08d3ee2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.404 186666 DEBUG nova.network.os_vif_util [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:de:56,bridge_name='br-int',has_traffic_filtering=True,id=8c60286e-6006-41af-ab4b-b58aa08d3ee2,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c60286e-60') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.405 186666 DEBUG os_vif [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:de:56,bridge_name='br-int',has_traffic_filtering=True,id=8c60286e-6006-41af-ab4b-b58aa08d3ee2,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c60286e-60') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.406 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.406 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.407 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.408 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.408 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '82ba480a-b861-56d3-a7ec-cb545e1461b1', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.430 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.431 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.433 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.433 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8c60286e-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.434 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap8c60286e-60, col_values=(('qos', UUID('82725af1-47c6-4024-b9fb-6d1b354d0686')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.434 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap8c60286e-60, col_values=(('external_ids', {'iface-id': '8c60286e-6006-41af-ab4b-b58aa08d3ee2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:89:de:56', 'vm-uuid': '9dfd1282-8be2-45bd-ad6c-5b8c9be761bc'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.435 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:26 compute-0 NetworkManager[56519]: <info>  [1771529306.4359] manager: (tap8c60286e-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.437 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.439 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:26 compute-0 nova_compute[186662]: 2026-02-19 19:28:26.440 186666 INFO os_vif [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:de:56,bridge_name='br-int',has_traffic_filtering=True,id=8c60286e-6006-41af-ab4b-b58aa08d3ee2,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c60286e-60')
Feb 19 19:28:27 compute-0 nova_compute[186662]: 2026-02-19 19:28:27.066 186666 INFO nova.virt.libvirt.driver [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 19 19:28:27 compute-0 nova_compute[186662]: 2026-02-19 19:28:27.273 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:27 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 19 19:28:27 compute-0 systemd[210001]: Activating special unit Exit the Session...
Feb 19 19:28:27 compute-0 systemd[210001]: Stopped target Main User Target.
Feb 19 19:28:27 compute-0 systemd[210001]: Stopped target Basic System.
Feb 19 19:28:27 compute-0 systemd[210001]: Stopped target Paths.
Feb 19 19:28:27 compute-0 systemd[210001]: Stopped target Sockets.
Feb 19 19:28:27 compute-0 systemd[210001]: Stopped target Timers.
Feb 19 19:28:27 compute-0 systemd[210001]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 19 19:28:27 compute-0 systemd[210001]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 19 19:28:27 compute-0 systemd[210001]: Closed D-Bus User Message Bus Socket.
Feb 19 19:28:27 compute-0 systemd[210001]: Stopped Create User's Volatile Files and Directories.
Feb 19 19:28:27 compute-0 systemd[210001]: Removed slice User Application Slice.
Feb 19 19:28:27 compute-0 systemd[210001]: Reached target Shutdown.
Feb 19 19:28:27 compute-0 systemd[210001]: Finished Exit the Session.
Feb 19 19:28:27 compute-0 systemd[210001]: Reached target Exit the Session.
Feb 19 19:28:27 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 19 19:28:27 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 19 19:28:27 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 19 19:28:27 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 19 19:28:27 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 19 19:28:27 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 19 19:28:27 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 19 19:28:27 compute-0 podman[210183]: 2026-02-19 19:28:27.540556408 +0000 UTC m=+0.052076770 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 19 19:28:27 compute-0 nova_compute[186662]: 2026-02-19 19:28:27.975 186666 DEBUG nova.virt.libvirt.driver [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:28:27 compute-0 nova_compute[186662]: 2026-02-19 19:28:27.976 186666 DEBUG nova.virt.libvirt.driver [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:28:27 compute-0 nova_compute[186662]: 2026-02-19 19:28:27.976 186666 DEBUG nova.virt.libvirt.driver [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] No VIF found with MAC fa:16:3e:89:de:56, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Feb 19 19:28:27 compute-0 nova_compute[186662]: 2026-02-19 19:28:27.977 186666 INFO nova.virt.libvirt.driver [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Using config drive
Feb 19 19:28:28 compute-0 kernel: tap8c60286e-60: entered promiscuous mode
Feb 19 19:28:28 compute-0 NetworkManager[56519]: <info>  [1771529308.0236] manager: (tap8c60286e-60): new Tun device (/org/freedesktop/NetworkManager/Devices/37)
Feb 19 19:28:28 compute-0 nova_compute[186662]: 2026-02-19 19:28:28.025 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:28 compute-0 ovn_controller[96653]: 2026-02-19T19:28:28Z|00072|binding|INFO|Claiming lport 8c60286e-6006-41af-ab4b-b58aa08d3ee2 for this chassis.
Feb 19 19:28:28 compute-0 ovn_controller[96653]: 2026-02-19T19:28:28Z|00073|binding|INFO|8c60286e-6006-41af-ab4b-b58aa08d3ee2: Claiming fa:16:3e:89:de:56 10.100.0.6
Feb 19 19:28:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:28.036 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:de:56 10.100.0.6'], port_security=['fa:16:3e:89:de:56 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '9dfd1282-8be2-45bd-ad6c-5b8c9be761bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-23744514-9581-483b-ba8d-38106bcd89ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84d22a8926d9401eb98cf092c0899a62', 'neutron:revision_number': '9', 'neutron:security_group_ids': '286c3adc-3464-487a-ae0e-8231188cef8f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46bbe139-eec8-4de4-bfdc-b507967fb452, chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=8c60286e-6006-41af-ab4b-b58aa08d3ee2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:28:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:28.037 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 8c60286e-6006-41af-ab4b-b58aa08d3ee2 in datapath 23744514-9581-483b-ba8d-38106bcd89ef bound to our chassis
Feb 19 19:28:28 compute-0 ovn_controller[96653]: 2026-02-19T19:28:28Z|00074|binding|INFO|Setting lport 8c60286e-6006-41af-ab4b-b58aa08d3ee2 ovn-installed in OVS
Feb 19 19:28:28 compute-0 ovn_controller[96653]: 2026-02-19T19:28:28Z|00075|binding|INFO|Setting lport 8c60286e-6006-41af-ab4b-b58aa08d3ee2 up in Southbound
Feb 19 19:28:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:28.038 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 23744514-9581-483b-ba8d-38106bcd89ef
Feb 19 19:28:28 compute-0 nova_compute[186662]: 2026-02-19 19:28:28.040 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:28 compute-0 nova_compute[186662]: 2026-02-19 19:28:28.044 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:28 compute-0 systemd-udevd[210221]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:28:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:28.049 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[11e017a6-9428-4e9b-ad75-aa74554864e7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:28 compute-0 systemd-machined[156014]: New machine qemu-7-instance-00000008.
Feb 19 19:28:28 compute-0 NetworkManager[56519]: <info>  [1771529308.0590] device (tap8c60286e-60): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:28:28 compute-0 NetworkManager[56519]: <info>  [1771529308.0598] device (tap8c60286e-60): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:28:28 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000008.
Feb 19 19:28:28 compute-0 nova_compute[186662]: 2026-02-19 19:28:28.070 186666 INFO nova.virt.libvirt.driver [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 19 19:28:28 compute-0 nova_compute[186662]: 2026-02-19 19:28:28.076 186666 DEBUG nova.compute.manager [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:28:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:28.083 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[c129f918-9409-4c8b-90fb-4345bcbddef0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:28.085 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[3148073c-7227-4bba-878b-b2513a1f5edc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:28.101 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[dfa0d7b3-09f3-4752-ab3e-f5fc79f647ed]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:28.113 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[d8979763-b237-465a-9820-ebd76ae84d89]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap23744514-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:f0:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 13, 'rx_bytes': 1540, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 13, 'rx_bytes': 1540, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341570, 'reachable_time': 39578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210233, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:28.123 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[64ac7292-1eee-430b-955e-ba85fb1704bb]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap23744514-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341576, 'tstamp': 341576}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210234, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap23744514-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341578, 'tstamp': 341578}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210234, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:28.124 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23744514-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:28 compute-0 nova_compute[186662]: 2026-02-19 19:28:28.126 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:28 compute-0 nova_compute[186662]: 2026-02-19 19:28:28.127 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:28.127 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23744514-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:28.127 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:28:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:28.128 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap23744514-90, col_values=(('external_ids', {'iface-id': '25b16785-ed71-4ba6-a91b-c4dcee5ff875'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:28.128 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:28:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:28.130 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[7b7e3514-e82a-4403-8851-d6f636cef11c]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-23744514-9581-483b-ba8d-38106bcd89ef\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 23744514-9581-483b-ba8d-38106bcd89ef\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:28 compute-0 nova_compute[186662]: 2026-02-19 19:28:28.586 186666 DEBUG nova.objects.instance [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Feb 19 19:28:28 compute-0 nova_compute[186662]: 2026-02-19 19:28:28.697 186666 DEBUG nova.compute.manager [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Feb 19 19:28:28 compute-0 nova_compute[186662]: 2026-02-19 19:28:28.700 186666 INFO nova.virt.libvirt.driver [-] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Instance running successfully.
Feb 19 19:28:28 compute-0 virtqemud[186157]: argument unsupported: QEMU guest agent is not configured
Feb 19 19:28:28 compute-0 nova_compute[186662]: 2026-02-19 19:28:28.702 186666 DEBUG nova.virt.libvirt.guest [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:200
Feb 19 19:28:28 compute-0 nova_compute[186662]: 2026-02-19 19:28:28.702 186666 DEBUG nova.virt.libvirt.driver [None req-b5c2af83-335f-417a-ba91-2e052ba17a1d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] finish_migration finished successfully. finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12699
Feb 19 19:28:28 compute-0 nova_compute[186662]: 2026-02-19 19:28:28.727 186666 DEBUG nova.compute.manager [req-b785581b-8e99-4e6c-90ca-65f2fc4b642c req-cfe2273e-6250-4314-94b7-2a8349a0bfc4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Received event network-vif-plugged-8c60286e-6006-41af-ab4b-b58aa08d3ee2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:28:28 compute-0 nova_compute[186662]: 2026-02-19 19:28:28.728 186666 DEBUG oslo_concurrency.lockutils [req-b785581b-8e99-4e6c-90ca-65f2fc4b642c req-cfe2273e-6250-4314-94b7-2a8349a0bfc4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:28:28 compute-0 nova_compute[186662]: 2026-02-19 19:28:28.728 186666 DEBUG oslo_concurrency.lockutils [req-b785581b-8e99-4e6c-90ca-65f2fc4b642c req-cfe2273e-6250-4314-94b7-2a8349a0bfc4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:28:28 compute-0 nova_compute[186662]: 2026-02-19 19:28:28.728 186666 DEBUG oslo_concurrency.lockutils [req-b785581b-8e99-4e6c-90ca-65f2fc4b642c req-cfe2273e-6250-4314-94b7-2a8349a0bfc4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:28:28 compute-0 nova_compute[186662]: 2026-02-19 19:28:28.728 186666 DEBUG nova.compute.manager [req-b785581b-8e99-4e6c-90ca-65f2fc4b642c req-cfe2273e-6250-4314-94b7-2a8349a0bfc4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] No waiting events found dispatching network-vif-plugged-8c60286e-6006-41af-ab4b-b58aa08d3ee2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:28:28 compute-0 nova_compute[186662]: 2026-02-19 19:28:28.729 186666 WARNING nova.compute.manager [req-b785581b-8e99-4e6c-90ca-65f2fc4b642c req-cfe2273e-6250-4314-94b7-2a8349a0bfc4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Received unexpected event network-vif-plugged-8c60286e-6006-41af-ab4b-b58aa08d3ee2 for instance with vm_state active and task_state resize_finish.
Feb 19 19:28:29 compute-0 nova_compute[186662]: 2026-02-19 19:28:29.605 186666 WARNING neutronclient.v2_0.client [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:29 compute-0 podman[196025]: time="2026-02-19T19:28:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:28:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:28:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17220 "" "Go-http-client/1.1"
Feb 19 19:28:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:28:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2644 "" "Go-http-client/1.1"
Feb 19 19:28:30 compute-0 nova_compute[186662]: 2026-02-19 19:28:30.326 186666 WARNING neutronclient.v2_0.client [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:30 compute-0 nova_compute[186662]: 2026-02-19 19:28:30.326 186666 WARNING neutronclient.v2_0.client [None req-20fb390d-75c2-447f-a592-b87910f09fb3 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:30 compute-0 nova_compute[186662]: 2026-02-19 19:28:30.828 186666 DEBUG nova.compute.manager [req-6d069e88-4a1d-403b-8c62-13edc50fc39a req-5428fd02-5550-4bb9-9ab8-d41fd9a0c2e5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Received event network-vif-plugged-8c60286e-6006-41af-ab4b-b58aa08d3ee2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:28:30 compute-0 nova_compute[186662]: 2026-02-19 19:28:30.828 186666 DEBUG oslo_concurrency.lockutils [req-6d069e88-4a1d-403b-8c62-13edc50fc39a req-5428fd02-5550-4bb9-9ab8-d41fd9a0c2e5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:28:30 compute-0 nova_compute[186662]: 2026-02-19 19:28:30.828 186666 DEBUG oslo_concurrency.lockutils [req-6d069e88-4a1d-403b-8c62-13edc50fc39a req-5428fd02-5550-4bb9-9ab8-d41fd9a0c2e5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:28:30 compute-0 nova_compute[186662]: 2026-02-19 19:28:30.829 186666 DEBUG oslo_concurrency.lockutils [req-6d069e88-4a1d-403b-8c62-13edc50fc39a req-5428fd02-5550-4bb9-9ab8-d41fd9a0c2e5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:28:30 compute-0 nova_compute[186662]: 2026-02-19 19:28:30.829 186666 DEBUG nova.compute.manager [req-6d069e88-4a1d-403b-8c62-13edc50fc39a req-5428fd02-5550-4bb9-9ab8-d41fd9a0c2e5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] No waiting events found dispatching network-vif-plugged-8c60286e-6006-41af-ab4b-b58aa08d3ee2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:28:30 compute-0 nova_compute[186662]: 2026-02-19 19:28:30.829 186666 WARNING nova.compute.manager [req-6d069e88-4a1d-403b-8c62-13edc50fc39a req-5428fd02-5550-4bb9-9ab8-d41fd9a0c2e5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Received unexpected event network-vif-plugged-8c60286e-6006-41af-ab4b-b58aa08d3ee2 for instance with vm_state resized and task_state None.
Feb 19 19:28:31 compute-0 podman[210244]: 2026-02-19 19:28:31.27890395 +0000 UTC m=+0.047074191 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Feb 19 19:28:31 compute-0 openstack_network_exporter[198916]: ERROR   19:28:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:28:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:28:31 compute-0 openstack_network_exporter[198916]: ERROR   19:28:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:28:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:28:31 compute-0 nova_compute[186662]: 2026-02-19 19:28:31.436 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:32.122 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:28:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:32.122 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:28:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:32.123 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:28:32 compute-0 nova_compute[186662]: 2026-02-19 19:28:32.278 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:33 compute-0 podman[210267]: 2026-02-19 19:28:33.286377075 +0000 UTC m=+0.065763989 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 19 19:28:36 compute-0 nova_compute[186662]: 2026-02-19 19:28:36.446 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:37 compute-0 nova_compute[186662]: 2026-02-19 19:28:37.279 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:41 compute-0 ovn_controller[96653]: 2026-02-19T19:28:41Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:89:de:56 10.100.0.6
Feb 19 19:28:41 compute-0 nova_compute[186662]: 2026-02-19 19:28:41.449 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:42 compute-0 nova_compute[186662]: 2026-02-19 19:28:42.281 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:42 compute-0 podman[210304]: 2026-02-19 19:28:42.291418583 +0000 UTC m=+0.066451456 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:28:46 compute-0 nova_compute[186662]: 2026-02-19 19:28:46.453 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:47 compute-0 nova_compute[186662]: 2026-02-19 19:28:47.283 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:51 compute-0 nova_compute[186662]: 2026-02-19 19:28:51.455 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:51 compute-0 nova_compute[186662]: 2026-02-19 19:28:51.988 186666 DEBUG oslo_concurrency.lockutils [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "bff022af-564c-46f6-8756-17ea48c6fdc5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:28:51 compute-0 nova_compute[186662]: 2026-02-19 19:28:51.989 186666 DEBUG oslo_concurrency.lockutils [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "bff022af-564c-46f6-8756-17ea48c6fdc5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:28:51 compute-0 nova_compute[186662]: 2026-02-19 19:28:51.989 186666 DEBUG oslo_concurrency.lockutils [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "bff022af-564c-46f6-8756-17ea48c6fdc5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:28:51 compute-0 nova_compute[186662]: 2026-02-19 19:28:51.990 186666 DEBUG oslo_concurrency.lockutils [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "bff022af-564c-46f6-8756-17ea48c6fdc5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:28:51 compute-0 nova_compute[186662]: 2026-02-19 19:28:51.990 186666 DEBUG oslo_concurrency.lockutils [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "bff022af-564c-46f6-8756-17ea48c6fdc5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:28:52 compute-0 nova_compute[186662]: 2026-02-19 19:28:52.003 186666 INFO nova.compute.manager [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Terminating instance
Feb 19 19:28:52 compute-0 nova_compute[186662]: 2026-02-19 19:28:52.286 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:52 compute-0 nova_compute[186662]: 2026-02-19 19:28:52.520 186666 DEBUG nova.compute.manager [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:28:52 compute-0 kernel: tapd89a6d19-fd (unregistering): left promiscuous mode
Feb 19 19:28:52 compute-0 NetworkManager[56519]: <info>  [1771529332.5504] device (tapd89a6d19-fd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:28:52 compute-0 nova_compute[186662]: 2026-02-19 19:28:52.556 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:52 compute-0 ovn_controller[96653]: 2026-02-19T19:28:52Z|00076|binding|INFO|Releasing lport d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 from this chassis (sb_readonly=0)
Feb 19 19:28:52 compute-0 ovn_controller[96653]: 2026-02-19T19:28:52Z|00077|binding|INFO|Setting lport d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 down in Southbound
Feb 19 19:28:52 compute-0 ovn_controller[96653]: 2026-02-19T19:28:52Z|00078|binding|INFO|Removing iface tapd89a6d19-fd ovn-installed in OVS
Feb 19 19:28:52 compute-0 nova_compute[186662]: 2026-02-19 19:28:52.560 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:52 compute-0 nova_compute[186662]: 2026-02-19 19:28:52.565 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:52 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:52.567 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:66:f9 10.100.0.10'], port_security=['fa:16:3e:9a:66:f9 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'bff022af-564c-46f6-8756-17ea48c6fdc5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-23744514-9581-483b-ba8d-38106bcd89ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84d22a8926d9401eb98cf092c0899a62', 'neutron:revision_number': '5', 'neutron:security_group_ids': '286c3adc-3464-487a-ae0e-8231188cef8f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46bbe139-eec8-4de4-bfdc-b507967fb452, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=d89a6d19-fdc2-4b79-baf7-7f00bdc698b6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:28:52 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:52.568 105986 INFO neutron.agent.ovn.metadata.agent [-] Port d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 in datapath 23744514-9581-483b-ba8d-38106bcd89ef unbound from our chassis
Feb 19 19:28:52 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:52.569 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 23744514-9581-483b-ba8d-38106bcd89ef
Feb 19 19:28:52 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:52.580 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[528ab5ab-efea-473b-bcad-5ee4461383ab]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:52 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Deactivated successfully.
Feb 19 19:28:52 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Consumed 14.692s CPU time.
Feb 19 19:28:52 compute-0 systemd-machined[156014]: Machine qemu-5-instance-00000009 terminated.
Feb 19 19:28:52 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:52.603 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[05622ef7-1a15-46ea-84fb-4dd19fd1f1be]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:52 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:52.606 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[a07514f4-b0da-4a7d-95f8-0619f2b46466]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:52 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:52.627 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[c27c75fb-5084-4f7f-9ac0-1d82849b517d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:52 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:52.641 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[b4bebb1f-4504-48b9-8d77-7eba537c8c90]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap23744514-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:f0:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 15, 'rx_bytes': 1792, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 15, 'rx_bytes': 1792, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341570, 'reachable_time': 39578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210343, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:52 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:52.655 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[e65febc8-be20-477b-9297-2aa19f496735]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap23744514-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341576, 'tstamp': 341576}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210344, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap23744514-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341578, 'tstamp': 341578}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210344, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:52 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:52.656 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23744514-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:52 compute-0 nova_compute[186662]: 2026-02-19 19:28:52.658 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:52 compute-0 nova_compute[186662]: 2026-02-19 19:28:52.664 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:52 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:52.664 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23744514-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:52 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:52.665 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:28:52 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:52.665 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap23744514-90, col_values=(('external_ids', {'iface-id': '25b16785-ed71-4ba6-a91b-c4dcee5ff875'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:52 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:52.666 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:28:52 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:52.667 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[598f7c34-9b84-4efd-8340-76a0a50b970e]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-23744514-9581-483b-ba8d-38106bcd89ef\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 23744514-9581-483b-ba8d-38106bcd89ef\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:52 compute-0 NetworkManager[56519]: <info>  [1771529332.7391] manager: (tapd89a6d19-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Feb 19 19:28:52 compute-0 nova_compute[186662]: 2026-02-19 19:28:52.740 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:52 compute-0 nova_compute[186662]: 2026-02-19 19:28:52.744 186666 DEBUG nova.compute.manager [req-195a56be-a1f5-41f2-b1c8-d38375faf630 req-5dcda7c7-408f-46d0-9126-a60ca957ec82 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Received event network-vif-unplugged-d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:28:52 compute-0 nova_compute[186662]: 2026-02-19 19:28:52.745 186666 DEBUG oslo_concurrency.lockutils [req-195a56be-a1f5-41f2-b1c8-d38375faf630 req-5dcda7c7-408f-46d0-9126-a60ca957ec82 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "bff022af-564c-46f6-8756-17ea48c6fdc5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:28:52 compute-0 nova_compute[186662]: 2026-02-19 19:28:52.745 186666 DEBUG oslo_concurrency.lockutils [req-195a56be-a1f5-41f2-b1c8-d38375faf630 req-5dcda7c7-408f-46d0-9126-a60ca957ec82 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bff022af-564c-46f6-8756-17ea48c6fdc5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:28:52 compute-0 nova_compute[186662]: 2026-02-19 19:28:52.745 186666 DEBUG oslo_concurrency.lockutils [req-195a56be-a1f5-41f2-b1c8-d38375faf630 req-5dcda7c7-408f-46d0-9126-a60ca957ec82 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bff022af-564c-46f6-8756-17ea48c6fdc5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:28:52 compute-0 nova_compute[186662]: 2026-02-19 19:28:52.746 186666 DEBUG nova.compute.manager [req-195a56be-a1f5-41f2-b1c8-d38375faf630 req-5dcda7c7-408f-46d0-9126-a60ca957ec82 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] No waiting events found dispatching network-vif-unplugged-d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:28:52 compute-0 nova_compute[186662]: 2026-02-19 19:28:52.746 186666 DEBUG nova.compute.manager [req-195a56be-a1f5-41f2-b1c8-d38375faf630 req-5dcda7c7-408f-46d0-9126-a60ca957ec82 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Received event network-vif-unplugged-d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:28:52 compute-0 nova_compute[186662]: 2026-02-19 19:28:52.747 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:52 compute-0 nova_compute[186662]: 2026-02-19 19:28:52.768 186666 INFO nova.virt.libvirt.driver [-] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Instance destroyed successfully.
Feb 19 19:28:52 compute-0 nova_compute[186662]: 2026-02-19 19:28:52.768 186666 DEBUG nova.objects.instance [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lazy-loading 'resources' on Instance uuid bff022af-564c-46f6-8756-17ea48c6fdc5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:28:53 compute-0 nova_compute[186662]: 2026-02-19 19:28:53.278 186666 DEBUG nova.virt.libvirt.vif [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:27:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-310914281',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-310914281',id=9,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:27:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='84d22a8926d9401eb98cf092c0899a62',ramdisk_id='',reservation_id='r-xyjvx1b2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1567565925',owner_user_name='tempest-TestExecuteActionsViaActuator-1567565925-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:27:51Z,user_data=None,user_id='af924faf672a45b4b8708466af6eeb12',uuid=bff022af-564c-46f6-8756-17ea48c6fdc5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d89a6d19-fdc2-4b79-baf7-7f00bdc698b6", "address": "fa:16:3e:9a:66:f9", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd89a6d19-fd", "ovs_interfaceid": "d89a6d19-fdc2-4b79-baf7-7f00bdc698b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:28:53 compute-0 nova_compute[186662]: 2026-02-19 19:28:53.279 186666 DEBUG nova.network.os_vif_util [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converting VIF {"id": "d89a6d19-fdc2-4b79-baf7-7f00bdc698b6", "address": "fa:16:3e:9a:66:f9", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd89a6d19-fd", "ovs_interfaceid": "d89a6d19-fdc2-4b79-baf7-7f00bdc698b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:28:53 compute-0 nova_compute[186662]: 2026-02-19 19:28:53.279 186666 DEBUG nova.network.os_vif_util [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:66:f9,bridge_name='br-int',has_traffic_filtering=True,id=d89a6d19-fdc2-4b79-baf7-7f00bdc698b6,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd89a6d19-fd') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:28:53 compute-0 nova_compute[186662]: 2026-02-19 19:28:53.280 186666 DEBUG os_vif [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:66:f9,bridge_name='br-int',has_traffic_filtering=True,id=d89a6d19-fdc2-4b79-baf7-7f00bdc698b6,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd89a6d19-fd') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:28:53 compute-0 nova_compute[186662]: 2026-02-19 19:28:53.281 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:53 compute-0 nova_compute[186662]: 2026-02-19 19:28:53.282 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd89a6d19-fd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:53 compute-0 nova_compute[186662]: 2026-02-19 19:28:53.318 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:53 compute-0 nova_compute[186662]: 2026-02-19 19:28:53.320 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:53 compute-0 nova_compute[186662]: 2026-02-19 19:28:53.321 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:53 compute-0 nova_compute[186662]: 2026-02-19 19:28:53.321 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=9d04e2d7-5749-47ca-bf86-5d70c2a3d684) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:53 compute-0 nova_compute[186662]: 2026-02-19 19:28:53.321 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:53 compute-0 nova_compute[186662]: 2026-02-19 19:28:53.323 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:53 compute-0 nova_compute[186662]: 2026-02-19 19:28:53.324 186666 INFO os_vif [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:66:f9,bridge_name='br-int',has_traffic_filtering=True,id=d89a6d19-fdc2-4b79-baf7-7f00bdc698b6,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd89a6d19-fd')
Feb 19 19:28:53 compute-0 nova_compute[186662]: 2026-02-19 19:28:53.325 186666 INFO nova.virt.libvirt.driver [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Deleting instance files /var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5_del
Feb 19 19:28:53 compute-0 nova_compute[186662]: 2026-02-19 19:28:53.325 186666 INFO nova.virt.libvirt.driver [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Deletion of /var/lib/nova/instances/bff022af-564c-46f6-8756-17ea48c6fdc5_del complete
Feb 19 19:28:53 compute-0 nova_compute[186662]: 2026-02-19 19:28:53.839 186666 INFO nova.compute.manager [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Took 1.32 seconds to destroy the instance on the hypervisor.
Feb 19 19:28:53 compute-0 nova_compute[186662]: 2026-02-19 19:28:53.840 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:28:53 compute-0 nova_compute[186662]: 2026-02-19 19:28:53.841 186666 DEBUG nova.compute.manager [-] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:28:53 compute-0 nova_compute[186662]: 2026-02-19 19:28:53.841 186666 DEBUG nova.network.neutron [-] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:28:53 compute-0 nova_compute[186662]: 2026-02-19 19:28:53.842 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:54 compute-0 nova_compute[186662]: 2026-02-19 19:28:54.571 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:28:54 compute-0 nova_compute[186662]: 2026-02-19 19:28:54.798 186666 DEBUG nova.compute.manager [req-e376bbd4-ada8-4d13-9c59-5d48517d472d req-9b540ea4-d49c-4f8f-be27-492e7339ca2b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Received event network-vif-unplugged-d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:28:54 compute-0 nova_compute[186662]: 2026-02-19 19:28:54.798 186666 DEBUG oslo_concurrency.lockutils [req-e376bbd4-ada8-4d13-9c59-5d48517d472d req-9b540ea4-d49c-4f8f-be27-492e7339ca2b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "bff022af-564c-46f6-8756-17ea48c6fdc5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:28:54 compute-0 nova_compute[186662]: 2026-02-19 19:28:54.799 186666 DEBUG oslo_concurrency.lockutils [req-e376bbd4-ada8-4d13-9c59-5d48517d472d req-9b540ea4-d49c-4f8f-be27-492e7339ca2b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bff022af-564c-46f6-8756-17ea48c6fdc5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:28:54 compute-0 nova_compute[186662]: 2026-02-19 19:28:54.799 186666 DEBUG oslo_concurrency.lockutils [req-e376bbd4-ada8-4d13-9c59-5d48517d472d req-9b540ea4-d49c-4f8f-be27-492e7339ca2b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bff022af-564c-46f6-8756-17ea48c6fdc5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:28:54 compute-0 nova_compute[186662]: 2026-02-19 19:28:54.799 186666 DEBUG nova.compute.manager [req-e376bbd4-ada8-4d13-9c59-5d48517d472d req-9b540ea4-d49c-4f8f-be27-492e7339ca2b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] No waiting events found dispatching network-vif-unplugged-d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:28:54 compute-0 nova_compute[186662]: 2026-02-19 19:28:54.799 186666 DEBUG nova.compute.manager [req-e376bbd4-ada8-4d13-9c59-5d48517d472d req-9b540ea4-d49c-4f8f-be27-492e7339ca2b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Received event network-vif-unplugged-d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:28:55 compute-0 nova_compute[186662]: 2026-02-19 19:28:55.341 186666 DEBUG nova.network.neutron [-] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:28:55 compute-0 nova_compute[186662]: 2026-02-19 19:28:55.847 186666 INFO nova.compute.manager [-] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Took 2.01 seconds to deallocate network for instance.
Feb 19 19:28:56 compute-0 nova_compute[186662]: 2026-02-19 19:28:56.531 186666 DEBUG oslo_concurrency.lockutils [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:28:56 compute-0 nova_compute[186662]: 2026-02-19 19:28:56.532 186666 DEBUG oslo_concurrency.lockutils [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:28:56 compute-0 nova_compute[186662]: 2026-02-19 19:28:56.671 186666 DEBUG nova.compute.provider_tree [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:28:56 compute-0 nova_compute[186662]: 2026-02-19 19:28:56.856 186666 DEBUG nova.compute.manager [req-ae6dce38-7e85-48b7-a718-f5fce04811bb req-0773b96e-3f02-4875-a073-8442dd3ebbcf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bff022af-564c-46f6-8756-17ea48c6fdc5] Received event network-vif-deleted-d89a6d19-fdc2-4b79-baf7-7f00bdc698b6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:28:57 compute-0 nova_compute[186662]: 2026-02-19 19:28:57.178 186666 DEBUG nova.scheduler.client.report [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:28:57 compute-0 nova_compute[186662]: 2026-02-19 19:28:57.329 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:57 compute-0 nova_compute[186662]: 2026-02-19 19:28:57.692 186666 DEBUG oslo_concurrency.lockutils [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.160s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:28:57 compute-0 nova_compute[186662]: 2026-02-19 19:28:57.719 186666 INFO nova.scheduler.client.report [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Deleted allocations for instance bff022af-564c-46f6-8756-17ea48c6fdc5
Feb 19 19:28:58 compute-0 podman[210362]: 2026-02-19 19:28:58.291588337 +0000 UTC m=+0.056482658 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Feb 19 19:28:58 compute-0 nova_compute[186662]: 2026-02-19 19:28:58.323 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:58 compute-0 nova_compute[186662]: 2026-02-19 19:28:58.747 186666 DEBUG oslo_concurrency.lockutils [None req-f245a966-d6f3-465a-816c-60c76005cb81 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "bff022af-564c-46f6-8756-17ea48c6fdc5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.758s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:28:59 compute-0 nova_compute[186662]: 2026-02-19 19:28:59.309 186666 DEBUG oslo_concurrency.lockutils [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:28:59 compute-0 nova_compute[186662]: 2026-02-19 19:28:59.310 186666 DEBUG oslo_concurrency.lockutils [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:28:59 compute-0 nova_compute[186662]: 2026-02-19 19:28:59.310 186666 DEBUG oslo_concurrency.lockutils [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:28:59 compute-0 nova_compute[186662]: 2026-02-19 19:28:59.310 186666 DEBUG oslo_concurrency.lockutils [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:28:59 compute-0 nova_compute[186662]: 2026-02-19 19:28:59.311 186666 DEBUG oslo_concurrency.lockutils [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:28:59 compute-0 nova_compute[186662]: 2026-02-19 19:28:59.325 186666 INFO nova.compute.manager [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Terminating instance
Feb 19 19:28:59 compute-0 podman[196025]: time="2026-02-19T19:28:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:28:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:28:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17220 "" "Go-http-client/1.1"
Feb 19 19:28:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:28:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2652 "" "Go-http-client/1.1"
Feb 19 19:28:59 compute-0 nova_compute[186662]: 2026-02-19 19:28:59.842 186666 DEBUG nova.compute.manager [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:28:59 compute-0 kernel: tap8c60286e-60 (unregistering): left promiscuous mode
Feb 19 19:28:59 compute-0 NetworkManager[56519]: <info>  [1771529339.8652] device (tap8c60286e-60): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:28:59 compute-0 ovn_controller[96653]: 2026-02-19T19:28:59Z|00079|binding|INFO|Releasing lport 8c60286e-6006-41af-ab4b-b58aa08d3ee2 from this chassis (sb_readonly=0)
Feb 19 19:28:59 compute-0 ovn_controller[96653]: 2026-02-19T19:28:59Z|00080|binding|INFO|Setting lport 8c60286e-6006-41af-ab4b-b58aa08d3ee2 down in Southbound
Feb 19 19:28:59 compute-0 nova_compute[186662]: 2026-02-19 19:28:59.871 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:59 compute-0 ovn_controller[96653]: 2026-02-19T19:28:59Z|00081|binding|INFO|Removing iface tap8c60286e-60 ovn-installed in OVS
Feb 19 19:28:59 compute-0 nova_compute[186662]: 2026-02-19 19:28:59.873 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:59 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:59.879 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:de:56 10.100.0.6'], port_security=['fa:16:3e:89:de:56 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '9dfd1282-8be2-45bd-ad6c-5b8c9be761bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-23744514-9581-483b-ba8d-38106bcd89ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84d22a8926d9401eb98cf092c0899a62', 'neutron:revision_number': '10', 'neutron:security_group_ids': '286c3adc-3464-487a-ae0e-8231188cef8f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46bbe139-eec8-4de4-bfdc-b507967fb452, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=8c60286e-6006-41af-ab4b-b58aa08d3ee2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:28:59 compute-0 nova_compute[186662]: 2026-02-19 19:28:59.879 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:59 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:59.880 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 8c60286e-6006-41af-ab4b-b58aa08d3ee2 in datapath 23744514-9581-483b-ba8d-38106bcd89ef unbound from our chassis
Feb 19 19:28:59 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:59.881 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 23744514-9581-483b-ba8d-38106bcd89ef
Feb 19 19:28:59 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:59.890 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[176880d6-75ba-4f6b-8fb9-4626c4fb25ff]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:59 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:59.909 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[7edeebc3-6f29-4e23-aa9a-1f75a7119c26]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:59 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:59.912 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[98be3c3d-2125-4adf-af2a-886228cd6706]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:59 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000008.scope: Deactivated successfully.
Feb 19 19:28:59 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000008.scope: Consumed 13.861s CPU time.
Feb 19 19:28:59 compute-0 systemd-machined[156014]: Machine qemu-7-instance-00000008 terminated.
Feb 19 19:28:59 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:59.936 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae7cd41-4128-4036-9ddb-e2428ad402e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:59 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:59.945 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[e0972858-bace-4f58-b3e4-778e604edb8d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap23744514-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:f0:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 17, 'rx_bytes': 1792, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 17, 'rx_bytes': 1792, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341570, 'reachable_time': 39578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210393, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:59 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:59.955 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[56b56676-36e9-4850-a82e-9b67729304a4]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap23744514-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341576, 'tstamp': 341576}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210394, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap23744514-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341578, 'tstamp': 341578}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210394, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:28:59 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:59.956 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23744514-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:59 compute-0 nova_compute[186662]: 2026-02-19 19:28:59.958 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:59 compute-0 nova_compute[186662]: 2026-02-19 19:28:59.960 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:28:59 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:59.960 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23744514-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:59 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:59.961 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:28:59 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:59.961 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap23744514-90, col_values=(('external_ids', {'iface-id': '25b16785-ed71-4ba6-a91b-c4dcee5ff875'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:28:59 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:59.961 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:28:59 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:28:59.962 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[8c830d57-a958-48e3-bcb1-a8735cad22f8]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-23744514-9581-483b-ba8d-38106bcd89ef\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 23744514-9581-483b-ba8d-38106bcd89ef\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.007 186666 DEBUG nova.compute.manager [req-2f1c2ff5-f0d4-417a-99ce-99e65c8af494 req-0b19aca0-877d-43d7-948d-5b9c69cfb181 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Received event network-vif-unplugged-8c60286e-6006-41af-ab4b-b58aa08d3ee2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.007 186666 DEBUG oslo_concurrency.lockutils [req-2f1c2ff5-f0d4-417a-99ce-99e65c8af494 req-0b19aca0-877d-43d7-948d-5b9c69cfb181 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.007 186666 DEBUG oslo_concurrency.lockutils [req-2f1c2ff5-f0d4-417a-99ce-99e65c8af494 req-0b19aca0-877d-43d7-948d-5b9c69cfb181 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.008 186666 DEBUG oslo_concurrency.lockutils [req-2f1c2ff5-f0d4-417a-99ce-99e65c8af494 req-0b19aca0-877d-43d7-948d-5b9c69cfb181 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.009 186666 DEBUG nova.compute.manager [req-2f1c2ff5-f0d4-417a-99ce-99e65c8af494 req-0b19aca0-877d-43d7-948d-5b9c69cfb181 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] No waiting events found dispatching network-vif-unplugged-8c60286e-6006-41af-ab4b-b58aa08d3ee2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.009 186666 DEBUG nova.compute.manager [req-2f1c2ff5-f0d4-417a-99ce-99e65c8af494 req-0b19aca0-877d-43d7-948d-5b9c69cfb181 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Received event network-vif-unplugged-8c60286e-6006-41af-ab4b-b58aa08d3ee2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.090 186666 INFO nova.virt.libvirt.driver [-] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Instance destroyed successfully.
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.091 186666 DEBUG nova.objects.instance [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lazy-loading 'resources' on Instance uuid 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.596 186666 DEBUG nova.virt.libvirt.vif [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:27:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1530972167',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1530972167',id=8,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:28:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='84d22a8926d9401eb98cf092c0899a62',ramdisk_id='',reservation_id='r-sc65cpv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1567565925',owner_user_name='tempest-TestExecuteActionsViaActuator-1567565925-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:28:46Z,user_data=None,user_id='af924faf672a45b4b8708466af6eeb12',uuid=9dfd1282-8be2-45bd-ad6c-5b8c9be761bc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8c60286e-6006-41af-ab4b-b58aa08d3ee2", "address": "fa:16:3e:89:de:56", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c60286e-60", "ovs_interfaceid": "8c60286e-6006-41af-ab4b-b58aa08d3ee2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.596 186666 DEBUG nova.network.os_vif_util [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converting VIF {"id": "8c60286e-6006-41af-ab4b-b58aa08d3ee2", "address": "fa:16:3e:89:de:56", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c60286e-60", "ovs_interfaceid": "8c60286e-6006-41af-ab4b-b58aa08d3ee2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.596 186666 DEBUG nova.network.os_vif_util [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:89:de:56,bridge_name='br-int',has_traffic_filtering=True,id=8c60286e-6006-41af-ab4b-b58aa08d3ee2,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c60286e-60') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.597 186666 DEBUG os_vif [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:89:de:56,bridge_name='br-int',has_traffic_filtering=True,id=8c60286e-6006-41af-ab4b-b58aa08d3ee2,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c60286e-60') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.598 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.598 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c60286e-60, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.599 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.601 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.602 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.602 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=82725af1-47c6-4024-b9fb-6d1b354d0686) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.603 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.604 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.605 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.608 186666 INFO os_vif [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:89:de:56,bridge_name='br-int',has_traffic_filtering=True,id=8c60286e-6006-41af-ab4b-b58aa08d3ee2,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c60286e-60')
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.608 186666 INFO nova.virt.libvirt.driver [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Deleting instance files /var/lib/nova/instances/9dfd1282-8be2-45bd-ad6c-5b8c9be761bc_del
Feb 19 19:29:00 compute-0 nova_compute[186662]: 2026-02-19 19:29:00.610 186666 INFO nova.virt.libvirt.driver [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Deletion of /var/lib/nova/instances/9dfd1282-8be2-45bd-ad6c-5b8c9be761bc_del complete
Feb 19 19:29:01 compute-0 nova_compute[186662]: 2026-02-19 19:29:01.121 186666 INFO nova.compute.manager [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Took 1.28 seconds to destroy the instance on the hypervisor.
Feb 19 19:29:01 compute-0 nova_compute[186662]: 2026-02-19 19:29:01.121 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:29:01 compute-0 nova_compute[186662]: 2026-02-19 19:29:01.122 186666 DEBUG nova.compute.manager [-] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:29:01 compute-0 nova_compute[186662]: 2026-02-19 19:29:01.122 186666 DEBUG nova.network.neutron [-] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:29:01 compute-0 nova_compute[186662]: 2026-02-19 19:29:01.122 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:29:01 compute-0 nova_compute[186662]: 2026-02-19 19:29:01.214 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:29:01 compute-0 openstack_network_exporter[198916]: ERROR   19:29:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:29:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:29:01 compute-0 openstack_network_exporter[198916]: ERROR   19:29:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:29:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:29:01 compute-0 nova_compute[186662]: 2026-02-19 19:29:01.541 186666 DEBUG nova.compute.manager [req-4c41a6f5-377a-4e12-b427-0c2637516fcf req-6e2e5552-da23-4735-80d6-858208adc7ce 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Received event network-vif-deleted-8c60286e-6006-41af-ab4b-b58aa08d3ee2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:29:01 compute-0 nova_compute[186662]: 2026-02-19 19:29:01.542 186666 INFO nova.compute.manager [req-4c41a6f5-377a-4e12-b427-0c2637516fcf req-6e2e5552-da23-4735-80d6-858208adc7ce 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Neutron deleted interface 8c60286e-6006-41af-ab4b-b58aa08d3ee2; detaching it from the instance and deleting it from the info cache
Feb 19 19:29:01 compute-0 nova_compute[186662]: 2026-02-19 19:29:01.542 186666 DEBUG nova.network.neutron [req-4c41a6f5-377a-4e12-b427-0c2637516fcf req-6e2e5552-da23-4735-80d6-858208adc7ce 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:29:01 compute-0 nova_compute[186662]: 2026-02-19 19:29:01.996 186666 DEBUG nova.network.neutron [-] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:29:02 compute-0 nova_compute[186662]: 2026-02-19 19:29:02.049 186666 DEBUG nova.compute.manager [req-4c41a6f5-377a-4e12-b427-0c2637516fcf req-6e2e5552-da23-4735-80d6-858208adc7ce 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Detach interface failed, port_id=8c60286e-6006-41af-ab4b-b58aa08d3ee2, reason: Instance 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Feb 19 19:29:02 compute-0 nova_compute[186662]: 2026-02-19 19:29:02.074 186666 DEBUG nova.compute.manager [req-69355b46-dfd8-4e1e-ad85-10bf369d86eb req-3a4f7ec1-1e89-476a-b319-7bad4fd0277b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Received event network-vif-unplugged-8c60286e-6006-41af-ab4b-b58aa08d3ee2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:29:02 compute-0 nova_compute[186662]: 2026-02-19 19:29:02.074 186666 DEBUG oslo_concurrency.lockutils [req-69355b46-dfd8-4e1e-ad85-10bf369d86eb req-3a4f7ec1-1e89-476a-b319-7bad4fd0277b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:02 compute-0 nova_compute[186662]: 2026-02-19 19:29:02.075 186666 DEBUG oslo_concurrency.lockutils [req-69355b46-dfd8-4e1e-ad85-10bf369d86eb req-3a4f7ec1-1e89-476a-b319-7bad4fd0277b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:02 compute-0 nova_compute[186662]: 2026-02-19 19:29:02.075 186666 DEBUG oslo_concurrency.lockutils [req-69355b46-dfd8-4e1e-ad85-10bf369d86eb req-3a4f7ec1-1e89-476a-b319-7bad4fd0277b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:02 compute-0 nova_compute[186662]: 2026-02-19 19:29:02.075 186666 DEBUG nova.compute.manager [req-69355b46-dfd8-4e1e-ad85-10bf369d86eb req-3a4f7ec1-1e89-476a-b319-7bad4fd0277b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] No waiting events found dispatching network-vif-unplugged-8c60286e-6006-41af-ab4b-b58aa08d3ee2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:29:02 compute-0 nova_compute[186662]: 2026-02-19 19:29:02.075 186666 DEBUG nova.compute.manager [req-69355b46-dfd8-4e1e-ad85-10bf369d86eb req-3a4f7ec1-1e89-476a-b319-7bad4fd0277b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Received event network-vif-unplugged-8c60286e-6006-41af-ab4b-b58aa08d3ee2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:29:02 compute-0 podman[210413]: 2026-02-19 19:29:02.32999191 +0000 UTC m=+0.091183380 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, name=ubi9/ubi-minimal, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 19 19:29:02 compute-0 nova_compute[186662]: 2026-02-19 19:29:02.332 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:02 compute-0 nova_compute[186662]: 2026-02-19 19:29:02.503 186666 INFO nova.compute.manager [-] [instance: 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc] Took 1.38 seconds to deallocate network for instance.
Feb 19 19:29:03 compute-0 nova_compute[186662]: 2026-02-19 19:29:03.018 186666 DEBUG oslo_concurrency.lockutils [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:03 compute-0 nova_compute[186662]: 2026-02-19 19:29:03.019 186666 DEBUG oslo_concurrency.lockutils [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:03 compute-0 nova_compute[186662]: 2026-02-19 19:29:03.025 186666 DEBUG oslo_concurrency.lockutils [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:03 compute-0 nova_compute[186662]: 2026-02-19 19:29:03.064 186666 INFO nova.scheduler.client.report [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Deleted allocations for instance 9dfd1282-8be2-45bd-ad6c-5b8c9be761bc
Feb 19 19:29:04 compute-0 nova_compute[186662]: 2026-02-19 19:29:04.096 186666 DEBUG oslo_concurrency.lockutils [None req-e63395cf-3ee8-4b4e-9c70-8f8c454a58fa af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "9dfd1282-8be2-45bd-ad6c-5b8c9be761bc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.786s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:04 compute-0 podman[210434]: 2026-02-19 19:29:04.310337593 +0000 UTC m=+0.074717624 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Feb 19 19:29:04 compute-0 nova_compute[186662]: 2026-02-19 19:29:04.850 186666 DEBUG oslo_concurrency.lockutils [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "9176d9ab-648d-4303-8d54-b208a7e1d395" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:04 compute-0 nova_compute[186662]: 2026-02-19 19:29:04.850 186666 DEBUG oslo_concurrency.lockutils [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "9176d9ab-648d-4303-8d54-b208a7e1d395" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:04 compute-0 nova_compute[186662]: 2026-02-19 19:29:04.851 186666 DEBUG oslo_concurrency.lockutils [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "9176d9ab-648d-4303-8d54-b208a7e1d395-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:04 compute-0 nova_compute[186662]: 2026-02-19 19:29:04.851 186666 DEBUG oslo_concurrency.lockutils [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "9176d9ab-648d-4303-8d54-b208a7e1d395-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:04 compute-0 nova_compute[186662]: 2026-02-19 19:29:04.851 186666 DEBUG oslo_concurrency.lockutils [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "9176d9ab-648d-4303-8d54-b208a7e1d395-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:04 compute-0 nova_compute[186662]: 2026-02-19 19:29:04.864 186666 INFO nova.compute.manager [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Terminating instance
Feb 19 19:29:05 compute-0 nova_compute[186662]: 2026-02-19 19:29:05.384 186666 DEBUG nova.compute.manager [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:29:05 compute-0 kernel: tap61c38173-e5 (unregistering): left promiscuous mode
Feb 19 19:29:05 compute-0 NetworkManager[56519]: <info>  [1771529345.4065] device (tap61c38173-e5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:29:05 compute-0 ovn_controller[96653]: 2026-02-19T19:29:05Z|00082|binding|INFO|Releasing lport 61c38173-e58d-41f4-a6e7-aa24531b3ed2 from this chassis (sb_readonly=0)
Feb 19 19:29:05 compute-0 nova_compute[186662]: 2026-02-19 19:29:05.416 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:05 compute-0 ovn_controller[96653]: 2026-02-19T19:29:05Z|00083|binding|INFO|Setting lport 61c38173-e58d-41f4-a6e7-aa24531b3ed2 down in Southbound
Feb 19 19:29:05 compute-0 ovn_controller[96653]: 2026-02-19T19:29:05Z|00084|binding|INFO|Removing iface tap61c38173-e5 ovn-installed in OVS
Feb 19 19:29:05 compute-0 nova_compute[186662]: 2026-02-19 19:29:05.418 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:05 compute-0 nova_compute[186662]: 2026-02-19 19:29:05.422 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:05.423 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:12:3f:ba 10.100.0.4'], port_security=['fa:16:3e:12:3f:ba 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '9176d9ab-648d-4303-8d54-b208a7e1d395', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-23744514-9581-483b-ba8d-38106bcd89ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84d22a8926d9401eb98cf092c0899a62', 'neutron:revision_number': '5', 'neutron:security_group_ids': '286c3adc-3464-487a-ae0e-8231188cef8f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46bbe139-eec8-4de4-bfdc-b507967fb452, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=61c38173-e58d-41f4-a6e7-aa24531b3ed2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:29:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:05.424 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 61c38173-e58d-41f4-a6e7-aa24531b3ed2 in datapath 23744514-9581-483b-ba8d-38106bcd89ef unbound from our chassis
Feb 19 19:29:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:05.425 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 23744514-9581-483b-ba8d-38106bcd89ef
Feb 19 19:29:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:05.437 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[b088a055-79c8-47b0-ab03-1bded07409ba]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:05.457 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[1269213a-8a96-4442-a270-aada36e6258a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:05.459 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[2e1dd41c-3899-4253-82e8-41a0f638336d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:05 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Deactivated successfully.
Feb 19 19:29:05 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Consumed 16.242s CPU time.
Feb 19 19:29:05 compute-0 systemd-machined[156014]: Machine qemu-4-instance-00000007 terminated.
Feb 19 19:29:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:05.474 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[d0ad8aff-4c07-4364-ad45-e83b4ec1a318]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:05.488 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a1aaf31d-f948-40db-92b9-3270d1d0cd3b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap23744514-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:f0:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 19, 'rx_bytes': 1792, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 19, 'rx_bytes': 1792, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341570, 'reachable_time': 39578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210474, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:05.500 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a87ed14c-a5b9-42d2-bc98-6462c4d45da4]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap23744514-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341576, 'tstamp': 341576}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210475, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap23744514-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341578, 'tstamp': 341578}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210475, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:05.501 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23744514-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:05 compute-0 nova_compute[186662]: 2026-02-19 19:29:05.502 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:05 compute-0 nova_compute[186662]: 2026-02-19 19:29:05.506 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:05.506 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23744514-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:05.506 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:29:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:05.506 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap23744514-90, col_values=(('external_ids', {'iface-id': '25b16785-ed71-4ba6-a91b-c4dcee5ff875'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:05.506 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:29:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:05.507 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[c424d8f9-aed6-4ba9-b2e2-c6f9902be043]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-23744514-9581-483b-ba8d-38106bcd89ef\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 23744514-9581-483b-ba8d-38106bcd89ef\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:05 compute-0 nova_compute[186662]: 2026-02-19 19:29:05.603 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:05 compute-0 nova_compute[186662]: 2026-02-19 19:29:05.627 186666 INFO nova.virt.libvirt.driver [-] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Instance destroyed successfully.
Feb 19 19:29:05 compute-0 nova_compute[186662]: 2026-02-19 19:29:05.628 186666 DEBUG nova.objects.instance [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lazy-loading 'resources' on Instance uuid 9176d9ab-648d-4303-8d54-b208a7e1d395 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:29:05 compute-0 nova_compute[186662]: 2026-02-19 19:29:05.634 186666 DEBUG nova.compute.manager [req-8ef2b62c-8b4d-465a-8ccd-68bf630aecb7 req-c5cb6fee-bff7-40ef-bb16-8f3fb40672b6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Received event network-vif-unplugged-61c38173-e58d-41f4-a6e7-aa24531b3ed2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:29:05 compute-0 nova_compute[186662]: 2026-02-19 19:29:05.634 186666 DEBUG oslo_concurrency.lockutils [req-8ef2b62c-8b4d-465a-8ccd-68bf630aecb7 req-c5cb6fee-bff7-40ef-bb16-8f3fb40672b6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "9176d9ab-648d-4303-8d54-b208a7e1d395-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:05 compute-0 nova_compute[186662]: 2026-02-19 19:29:05.634 186666 DEBUG oslo_concurrency.lockutils [req-8ef2b62c-8b4d-465a-8ccd-68bf630aecb7 req-c5cb6fee-bff7-40ef-bb16-8f3fb40672b6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "9176d9ab-648d-4303-8d54-b208a7e1d395-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:05 compute-0 nova_compute[186662]: 2026-02-19 19:29:05.634 186666 DEBUG oslo_concurrency.lockutils [req-8ef2b62c-8b4d-465a-8ccd-68bf630aecb7 req-c5cb6fee-bff7-40ef-bb16-8f3fb40672b6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "9176d9ab-648d-4303-8d54-b208a7e1d395-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:05 compute-0 nova_compute[186662]: 2026-02-19 19:29:05.635 186666 DEBUG nova.compute.manager [req-8ef2b62c-8b4d-465a-8ccd-68bf630aecb7 req-c5cb6fee-bff7-40ef-bb16-8f3fb40672b6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] No waiting events found dispatching network-vif-unplugged-61c38173-e58d-41f4-a6e7-aa24531b3ed2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:29:05 compute-0 nova_compute[186662]: 2026-02-19 19:29:05.635 186666 DEBUG nova.compute.manager [req-8ef2b62c-8b4d-465a-8ccd-68bf630aecb7 req-c5cb6fee-bff7-40ef-bb16-8f3fb40672b6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Received event network-vif-unplugged-61c38173-e58d-41f4-a6e7-aa24531b3ed2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:29:06 compute-0 nova_compute[186662]: 2026-02-19 19:29:06.132 186666 DEBUG nova.virt.libvirt.vif [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:26:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1665463927',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1665463927',id=7,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:27:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='84d22a8926d9401eb98cf092c0899a62',ramdisk_id='',reservation_id='r-6oueca0b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1567565925',owner_user_name='tempest-TestExecuteActionsViaActuator-1567565925-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:27:11Z,user_data=None,user_id='af924faf672a45b4b8708466af6eeb12',uuid=9176d9ab-648d-4303-8d54-b208a7e1d395,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "61c38173-e58d-41f4-a6e7-aa24531b3ed2", "address": "fa:16:3e:12:3f:ba", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c38173-e5", "ovs_interfaceid": "61c38173-e58d-41f4-a6e7-aa24531b3ed2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:29:06 compute-0 nova_compute[186662]: 2026-02-19 19:29:06.133 186666 DEBUG nova.network.os_vif_util [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converting VIF {"id": "61c38173-e58d-41f4-a6e7-aa24531b3ed2", "address": "fa:16:3e:12:3f:ba", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c38173-e5", "ovs_interfaceid": "61c38173-e58d-41f4-a6e7-aa24531b3ed2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:29:06 compute-0 nova_compute[186662]: 2026-02-19 19:29:06.133 186666 DEBUG nova.network.os_vif_util [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:12:3f:ba,bridge_name='br-int',has_traffic_filtering=True,id=61c38173-e58d-41f4-a6e7-aa24531b3ed2,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c38173-e5') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:29:06 compute-0 nova_compute[186662]: 2026-02-19 19:29:06.133 186666 DEBUG os_vif [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:3f:ba,bridge_name='br-int',has_traffic_filtering=True,id=61c38173-e58d-41f4-a6e7-aa24531b3ed2,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c38173-e5') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:29:06 compute-0 nova_compute[186662]: 2026-02-19 19:29:06.134 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:06 compute-0 nova_compute[186662]: 2026-02-19 19:29:06.135 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61c38173-e5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:06 compute-0 nova_compute[186662]: 2026-02-19 19:29:06.186 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:06 compute-0 nova_compute[186662]: 2026-02-19 19:29:06.187 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:06 compute-0 nova_compute[186662]: 2026-02-19 19:29:06.188 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:06 compute-0 nova_compute[186662]: 2026-02-19 19:29:06.188 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=9297d20e-e7fb-4f35-b382-acc7a6ea7538) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:06 compute-0 nova_compute[186662]: 2026-02-19 19:29:06.189 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:06 compute-0 nova_compute[186662]: 2026-02-19 19:29:06.191 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:29:06 compute-0 nova_compute[186662]: 2026-02-19 19:29:06.193 186666 INFO os_vif [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:3f:ba,bridge_name='br-int',has_traffic_filtering=True,id=61c38173-e58d-41f4-a6e7-aa24531b3ed2,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c38173-e5')
Feb 19 19:29:06 compute-0 nova_compute[186662]: 2026-02-19 19:29:06.193 186666 INFO nova.virt.libvirt.driver [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Deleting instance files /var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395_del
Feb 19 19:29:06 compute-0 nova_compute[186662]: 2026-02-19 19:29:06.194 186666 INFO nova.virt.libvirt.driver [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Deletion of /var/lib/nova/instances/9176d9ab-648d-4303-8d54-b208a7e1d395_del complete
Feb 19 19:29:06 compute-0 sshd-session[210494]: Invalid user teamspeak3 from 45.169.200.254 port 41380
Feb 19 19:29:06 compute-0 nova_compute[186662]: 2026-02-19 19:29:06.704 186666 INFO nova.compute.manager [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Took 1.32 seconds to destroy the instance on the hypervisor.
Feb 19 19:29:06 compute-0 nova_compute[186662]: 2026-02-19 19:29:06.704 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:29:06 compute-0 nova_compute[186662]: 2026-02-19 19:29:06.705 186666 DEBUG nova.compute.manager [-] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:29:06 compute-0 nova_compute[186662]: 2026-02-19 19:29:06.705 186666 DEBUG nova.network.neutron [-] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:29:06 compute-0 nova_compute[186662]: 2026-02-19 19:29:06.705 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:29:06 compute-0 sshd-session[210494]: Received disconnect from 45.169.200.254 port 41380:11: Bye Bye [preauth]
Feb 19 19:29:06 compute-0 sshd-session[210494]: Disconnected from invalid user teamspeak3 45.169.200.254 port 41380 [preauth]
Feb 19 19:29:07 compute-0 nova_compute[186662]: 2026-02-19 19:29:07.335 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:07 compute-0 nova_compute[186662]: 2026-02-19 19:29:07.604 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:29:07 compute-0 nova_compute[186662]: 2026-02-19 19:29:07.693 186666 DEBUG nova.compute.manager [req-015e6f41-d6fd-42e6-886d-d2fe44f4cd89 req-e72f89fd-5200-4045-9d0d-4aaf53982479 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Received event network-vif-unplugged-61c38173-e58d-41f4-a6e7-aa24531b3ed2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:29:07 compute-0 nova_compute[186662]: 2026-02-19 19:29:07.694 186666 DEBUG oslo_concurrency.lockutils [req-015e6f41-d6fd-42e6-886d-d2fe44f4cd89 req-e72f89fd-5200-4045-9d0d-4aaf53982479 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "9176d9ab-648d-4303-8d54-b208a7e1d395-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:07 compute-0 nova_compute[186662]: 2026-02-19 19:29:07.695 186666 DEBUG oslo_concurrency.lockutils [req-015e6f41-d6fd-42e6-886d-d2fe44f4cd89 req-e72f89fd-5200-4045-9d0d-4aaf53982479 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "9176d9ab-648d-4303-8d54-b208a7e1d395-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:07 compute-0 nova_compute[186662]: 2026-02-19 19:29:07.695 186666 DEBUG oslo_concurrency.lockutils [req-015e6f41-d6fd-42e6-886d-d2fe44f4cd89 req-e72f89fd-5200-4045-9d0d-4aaf53982479 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "9176d9ab-648d-4303-8d54-b208a7e1d395-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:07 compute-0 nova_compute[186662]: 2026-02-19 19:29:07.696 186666 DEBUG nova.compute.manager [req-015e6f41-d6fd-42e6-886d-d2fe44f4cd89 req-e72f89fd-5200-4045-9d0d-4aaf53982479 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] No waiting events found dispatching network-vif-unplugged-61c38173-e58d-41f4-a6e7-aa24531b3ed2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:29:07 compute-0 nova_compute[186662]: 2026-02-19 19:29:07.696 186666 DEBUG nova.compute.manager [req-015e6f41-d6fd-42e6-886d-d2fe44f4cd89 req-e72f89fd-5200-4045-9d0d-4aaf53982479 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Received event network-vif-unplugged-61c38173-e58d-41f4-a6e7-aa24531b3ed2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:29:09 compute-0 nova_compute[186662]: 2026-02-19 19:29:09.135 186666 DEBUG nova.network.neutron [-] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:29:09 compute-0 nova_compute[186662]: 2026-02-19 19:29:09.641 186666 INFO nova.compute.manager [-] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Took 2.94 seconds to deallocate network for instance.
Feb 19 19:29:09 compute-0 nova_compute[186662]: 2026-02-19 19:29:09.751 186666 DEBUG nova.compute.manager [req-75afc0f8-8e84-4372-bd30-81f69c41174f req-e302950a-e3ad-4ed4-a217-ceeaa117e996 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 9176d9ab-648d-4303-8d54-b208a7e1d395] Received event network-vif-deleted-61c38173-e58d-41f4-a6e7-aa24531b3ed2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:29:10 compute-0 nova_compute[186662]: 2026-02-19 19:29:10.164 186666 DEBUG oslo_concurrency.lockutils [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:10 compute-0 nova_compute[186662]: 2026-02-19 19:29:10.165 186666 DEBUG oslo_concurrency.lockutils [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:10 compute-0 nova_compute[186662]: 2026-02-19 19:29:10.274 186666 DEBUG nova.compute.provider_tree [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:29:10 compute-0 nova_compute[186662]: 2026-02-19 19:29:10.782 186666 DEBUG nova.scheduler.client.report [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:29:11 compute-0 nova_compute[186662]: 2026-02-19 19:29:11.190 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:11 compute-0 nova_compute[186662]: 2026-02-19 19:29:11.295 186666 DEBUG oslo_concurrency.lockutils [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.130s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:11 compute-0 nova_compute[186662]: 2026-02-19 19:29:11.319 186666 INFO nova.scheduler.client.report [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Deleted allocations for instance 9176d9ab-648d-4303-8d54-b208a7e1d395
Feb 19 19:29:12 compute-0 nova_compute[186662]: 2026-02-19 19:29:12.337 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:12 compute-0 nova_compute[186662]: 2026-02-19 19:29:12.351 186666 DEBUG oslo_concurrency.lockutils [None req-2b39b2ab-1551-4b61-836c-f8591e9ce382 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "9176d9ab-648d-4303-8d54-b208a7e1d395" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.501s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:12 compute-0 nova_compute[186662]: 2026-02-19 19:29:12.537 186666 DEBUG oslo_concurrency.lockutils [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "7874c50f-bdd7-48d9-bbab-6db09f173178" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:12 compute-0 nova_compute[186662]: 2026-02-19 19:29:12.538 186666 DEBUG oslo_concurrency.lockutils [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "7874c50f-bdd7-48d9-bbab-6db09f173178" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:12 compute-0 nova_compute[186662]: 2026-02-19 19:29:12.538 186666 DEBUG oslo_concurrency.lockutils [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "7874c50f-bdd7-48d9-bbab-6db09f173178-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:12 compute-0 nova_compute[186662]: 2026-02-19 19:29:12.539 186666 DEBUG oslo_concurrency.lockutils [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "7874c50f-bdd7-48d9-bbab-6db09f173178-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:12 compute-0 nova_compute[186662]: 2026-02-19 19:29:12.539 186666 DEBUG oslo_concurrency.lockutils [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "7874c50f-bdd7-48d9-bbab-6db09f173178-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:12 compute-0 nova_compute[186662]: 2026-02-19 19:29:12.556 186666 INFO nova.compute.manager [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Terminating instance
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.074 186666 DEBUG nova.compute.manager [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:29:13 compute-0 kernel: tap73ee194b-90 (unregistering): left promiscuous mode
Feb 19 19:29:13 compute-0 NetworkManager[56519]: <info>  [1771529353.1207] device (tap73ee194b-90): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:29:13 compute-0 ovn_controller[96653]: 2026-02-19T19:29:13Z|00085|binding|INFO|Releasing lport 73ee194b-90a5-4834-81db-341af15d3ad4 from this chassis (sb_readonly=0)
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.131 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:13 compute-0 ovn_controller[96653]: 2026-02-19T19:29:13Z|00086|binding|INFO|Setting lport 73ee194b-90a5-4834-81db-341af15d3ad4 down in Southbound
Feb 19 19:29:13 compute-0 ovn_controller[96653]: 2026-02-19T19:29:13Z|00087|binding|INFO|Removing iface tap73ee194b-90 ovn-installed in OVS
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.135 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.141 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:13 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:13.144 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:d2:d7 10.100.0.5'], port_security=['fa:16:3e:41:d2:d7 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7874c50f-bdd7-48d9-bbab-6db09f173178', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-23744514-9581-483b-ba8d-38106bcd89ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84d22a8926d9401eb98cf092c0899a62', 'neutron:revision_number': '15', 'neutron:security_group_ids': '286c3adc-3464-487a-ae0e-8231188cef8f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46bbe139-eec8-4de4-bfdc-b507967fb452, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=73ee194b-90a5-4834-81db-341af15d3ad4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:29:13 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:13.146 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 73ee194b-90a5-4834-81db-341af15d3ad4 in datapath 23744514-9581-483b-ba8d-38106bcd89ef unbound from our chassis
Feb 19 19:29:13 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:13.148 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 23744514-9581-483b-ba8d-38106bcd89ef
Feb 19 19:29:13 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:13.167 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[015b3edb-9546-42bd-a8a6-3ed46fe99062]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:13 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Deactivated successfully.
Feb 19 19:29:13 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Consumed 3.519s CPU time.
Feb 19 19:29:13 compute-0 systemd-machined[156014]: Machine qemu-6-instance-00000006 terminated.
Feb 19 19:29:13 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:13.209 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[7c87d74a-3f7d-401c-b55e-2789f7353e5d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:13 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:13.213 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[04dc56a9-eb31-4234-86e5-db9012801d27]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:13 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:13.246 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[6db1ee17-e719-47bb-b447-bf56f0055edb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:13 compute-0 podman[210498]: 2026-02-19 19:29:13.258810136 +0000 UTC m=+0.102375770 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:29:13 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:13.266 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[15f3450b-cb38-4833-a4fe-9c7c92109413]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap23744514-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:f0:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 21, 'rx_bytes': 1792, 'tx_bytes': 1026, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 21, 'rx_bytes': 1792, 'tx_bytes': 1026, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341570, 'reachable_time': 39578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210532, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:13 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:13.285 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ce04f46d-a7e2-4741-90a5-ebaa9f9157f3]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap23744514-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341576, 'tstamp': 341576}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210533, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap23744514-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341578, 'tstamp': 341578}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210533, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:13 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:13.287 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23744514-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.332 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.333 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.339 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:13 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:13.341 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23744514-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:13 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:13.341 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:29:13 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:13.342 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap23744514-90, col_values=(('external_ids', {'iface-id': '25b16785-ed71-4ba6-a91b-c4dcee5ff875'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:13 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:13.342 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:29:13 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:13.344 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[401e47c6-d914-4249-a9cd-a5221e1d2a31]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-23744514-9581-483b-ba8d-38106bcd89ef\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 23744514-9581-483b-ba8d-38106bcd89ef\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.374 186666 INFO nova.virt.libvirt.driver [-] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Instance destroyed successfully.
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.374 186666 DEBUG nova.objects.instance [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lazy-loading 'resources' on Instance uuid 7874c50f-bdd7-48d9-bbab-6db09f173178 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.399 186666 DEBUG nova.compute.manager [req-88a270af-fb9e-4ede-bbff-794de7044ef4 req-8e87a4f2-ff38-4f07-b64a-32bac581dfca 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Received event network-vif-unplugged-73ee194b-90a5-4834-81db-341af15d3ad4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.400 186666 DEBUG oslo_concurrency.lockutils [req-88a270af-fb9e-4ede-bbff-794de7044ef4 req-8e87a4f2-ff38-4f07-b64a-32bac581dfca 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "7874c50f-bdd7-48d9-bbab-6db09f173178-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.400 186666 DEBUG oslo_concurrency.lockutils [req-88a270af-fb9e-4ede-bbff-794de7044ef4 req-8e87a4f2-ff38-4f07-b64a-32bac581dfca 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "7874c50f-bdd7-48d9-bbab-6db09f173178-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.400 186666 DEBUG oslo_concurrency.lockutils [req-88a270af-fb9e-4ede-bbff-794de7044ef4 req-8e87a4f2-ff38-4f07-b64a-32bac581dfca 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "7874c50f-bdd7-48d9-bbab-6db09f173178-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.400 186666 DEBUG nova.compute.manager [req-88a270af-fb9e-4ede-bbff-794de7044ef4 req-8e87a4f2-ff38-4f07-b64a-32bac581dfca 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] No waiting events found dispatching network-vif-unplugged-73ee194b-90a5-4834-81db-341af15d3ad4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.400 186666 DEBUG nova.compute.manager [req-88a270af-fb9e-4ede-bbff-794de7044ef4 req-8e87a4f2-ff38-4f07-b64a-32bac581dfca 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Received event network-vif-unplugged-73ee194b-90a5-4834-81db-341af15d3ad4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.881 186666 DEBUG nova.virt.libvirt.vif [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-02-19T19:26:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-105068416',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-105068416',id=6,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:26:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='84d22a8926d9401eb98cf092c0899a62',ramdisk_id='',reservation_id='r-nno39c23',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',clean_attempts='1',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1567565925',owner_user_name='tempest-TestExecuteActionsViaActuator-1567565925-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:28:29Z,user_data=None,user_id='af924faf672a45b4b8708466af6eeb12',uuid=7874c50f-bdd7-48d9-bbab-6db09f173178,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "73ee194b-90a5-4834-81db-341af15d3ad4", "address": "fa:16:3e:41:d2:d7", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73ee194b-90", "ovs_interfaceid": "73ee194b-90a5-4834-81db-341af15d3ad4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.881 186666 DEBUG nova.network.os_vif_util [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converting VIF {"id": "73ee194b-90a5-4834-81db-341af15d3ad4", "address": "fa:16:3e:41:d2:d7", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73ee194b-90", "ovs_interfaceid": "73ee194b-90a5-4834-81db-341af15d3ad4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.883 186666 DEBUG nova.network.os_vif_util [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:41:d2:d7,bridge_name='br-int',has_traffic_filtering=True,id=73ee194b-90a5-4834-81db-341af15d3ad4,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73ee194b-90') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.883 186666 DEBUG os_vif [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:41:d2:d7,bridge_name='br-int',has_traffic_filtering=True,id=73ee194b-90a5-4834-81db-341af15d3ad4,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73ee194b-90') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.885 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.886 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap73ee194b-90, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.888 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.890 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.891 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.891 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=3909986a-e279-40ac-af95-1b48ac5c3444) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.892 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.894 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.896 186666 INFO os_vif [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:41:d2:d7,bridge_name='br-int',has_traffic_filtering=True,id=73ee194b-90a5-4834-81db-341af15d3ad4,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73ee194b-90')
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.897 186666 INFO nova.virt.libvirt.driver [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Deleting instance files /var/lib/nova/instances/7874c50f-bdd7-48d9-bbab-6db09f173178_del
Feb 19 19:29:13 compute-0 nova_compute[186662]: 2026-02-19 19:29:13.898 186666 INFO nova.virt.libvirt.driver [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Deletion of /var/lib/nova/instances/7874c50f-bdd7-48d9-bbab-6db09f173178_del complete
Feb 19 19:29:14 compute-0 nova_compute[186662]: 2026-02-19 19:29:14.411 186666 INFO nova.compute.manager [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Took 1.34 seconds to destroy the instance on the hypervisor.
Feb 19 19:29:14 compute-0 nova_compute[186662]: 2026-02-19 19:29:14.412 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:29:14 compute-0 nova_compute[186662]: 2026-02-19 19:29:14.412 186666 DEBUG nova.compute.manager [-] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:29:14 compute-0 nova_compute[186662]: 2026-02-19 19:29:14.413 186666 DEBUG nova.network.neutron [-] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:29:14 compute-0 nova_compute[186662]: 2026-02-19 19:29:14.413 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:29:14 compute-0 nova_compute[186662]: 2026-02-19 19:29:14.504 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:29:14 compute-0 nova_compute[186662]: 2026-02-19 19:29:14.505 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:29:14 compute-0 nova_compute[186662]: 2026-02-19 19:29:14.597 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:29:14 compute-0 nova_compute[186662]: 2026-02-19 19:29:14.934 186666 DEBUG nova.compute.manager [req-8ef13224-0450-4f51-bc6b-bdce0f61cc6c req-ad25adeb-5981-41e7-ab0f-c76c975ea3f6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Received event network-vif-deleted-73ee194b-90a5-4834-81db-341af15d3ad4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:29:14 compute-0 nova_compute[186662]: 2026-02-19 19:29:14.934 186666 INFO nova.compute.manager [req-8ef13224-0450-4f51-bc6b-bdce0f61cc6c req-ad25adeb-5981-41e7-ab0f-c76c975ea3f6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Neutron deleted interface 73ee194b-90a5-4834-81db-341af15d3ad4; detaching it from the instance and deleting it from the info cache
Feb 19 19:29:14 compute-0 nova_compute[186662]: 2026-02-19 19:29:14.934 186666 DEBUG nova.network.neutron [req-8ef13224-0450-4f51-bc6b-bdce0f61cc6c req-ad25adeb-5981-41e7-ab0f-c76c975ea3f6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.017 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.017 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.018 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.018 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.018 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.019 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.019 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.020 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.373 186666 DEBUG nova.network.neutron [-] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.442 186666 DEBUG nova.compute.manager [req-8ef13224-0450-4f51-bc6b-bdce0f61cc6c req-ad25adeb-5981-41e7-ab0f-c76c975ea3f6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Detach interface failed, port_id=73ee194b-90a5-4834-81db-341af15d3ad4, reason: Instance 7874c50f-bdd7-48d9-bbab-6db09f173178 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.455 186666 DEBUG nova.compute.manager [req-5215dc43-58fc-496a-b969-29ae5e2f92e5 req-eaf7ea7b-a03e-4789-8b1c-1f79d8918506 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Received event network-vif-unplugged-73ee194b-90a5-4834-81db-341af15d3ad4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.456 186666 DEBUG oslo_concurrency.lockutils [req-5215dc43-58fc-496a-b969-29ae5e2f92e5 req-eaf7ea7b-a03e-4789-8b1c-1f79d8918506 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "7874c50f-bdd7-48d9-bbab-6db09f173178-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.456 186666 DEBUG oslo_concurrency.lockutils [req-5215dc43-58fc-496a-b969-29ae5e2f92e5 req-eaf7ea7b-a03e-4789-8b1c-1f79d8918506 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "7874c50f-bdd7-48d9-bbab-6db09f173178-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.457 186666 DEBUG oslo_concurrency.lockutils [req-5215dc43-58fc-496a-b969-29ae5e2f92e5 req-eaf7ea7b-a03e-4789-8b1c-1f79d8918506 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "7874c50f-bdd7-48d9-bbab-6db09f173178-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.457 186666 DEBUG nova.compute.manager [req-5215dc43-58fc-496a-b969-29ae5e2f92e5 req-eaf7ea7b-a03e-4789-8b1c-1f79d8918506 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] No waiting events found dispatching network-vif-unplugged-73ee194b-90a5-4834-81db-341af15d3ad4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.457 186666 DEBUG nova.compute.manager [req-5215dc43-58fc-496a-b969-29ae5e2f92e5 req-eaf7ea7b-a03e-4789-8b1c-1f79d8918506 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Received event network-vif-unplugged-73ee194b-90a5-4834-81db-341af15d3ad4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.533 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.534 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.534 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.534 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:29:15 compute-0 nova_compute[186662]: 2026-02-19 19:29:15.879 186666 INFO nova.compute.manager [-] [instance: 7874c50f-bdd7-48d9-bbab-6db09f173178] Took 1.47 seconds to deallocate network for instance.
Feb 19 19:29:16 compute-0 nova_compute[186662]: 2026-02-19 19:29:16.395 186666 DEBUG oslo_concurrency.lockutils [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:16 compute-0 nova_compute[186662]: 2026-02-19 19:29:16.396 186666 DEBUG oslo_concurrency.lockutils [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:16 compute-0 nova_compute[186662]: 2026-02-19 19:29:16.402 186666 DEBUG oslo_concurrency.lockutils [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:16 compute-0 nova_compute[186662]: 2026-02-19 19:29:16.440 186666 INFO nova.scheduler.client.report [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Deleted allocations for instance 7874c50f-bdd7-48d9-bbab-6db09f173178
Feb 19 19:29:16 compute-0 nova_compute[186662]: 2026-02-19 19:29:16.589 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:29:16 compute-0 nova_compute[186662]: 2026-02-19 19:29:16.645 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:29:16 compute-0 nova_compute[186662]: 2026-02-19 19:29:16.646 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:29:16 compute-0 nova_compute[186662]: 2026-02-19 19:29:16.721 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:29:16 compute-0 nova_compute[186662]: 2026-02-19 19:29:16.728 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:29:16 compute-0 nova_compute[186662]: 2026-02-19 19:29:16.784 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:29:16 compute-0 nova_compute[186662]: 2026-02-19 19:29:16.785 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:29:16 compute-0 nova_compute[186662]: 2026-02-19 19:29:16.854 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:29:17 compute-0 nova_compute[186662]: 2026-02-19 19:29:17.062 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:29:17 compute-0 nova_compute[186662]: 2026-02-19 19:29:17.064 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:29:17 compute-0 nova_compute[186662]: 2026-02-19 19:29:17.076 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:29:17 compute-0 nova_compute[186662]: 2026-02-19 19:29:17.076 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5522MB free_disk=72.91865539550781GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:29:17 compute-0 nova_compute[186662]: 2026-02-19 19:29:17.076 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:17 compute-0 nova_compute[186662]: 2026-02-19 19:29:17.077 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:17 compute-0 nova_compute[186662]: 2026-02-19 19:29:17.339 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:17 compute-0 nova_compute[186662]: 2026-02-19 19:29:17.471 186666 DEBUG oslo_concurrency.lockutils [None req-c5c0ef22-6719-4341-bd1c-473a3dd3ecb8 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "7874c50f-bdd7-48d9-bbab-6db09f173178" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.934s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:18 compute-0 nova_compute[186662]: 2026-02-19 19:29:18.115 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance 72cc675e-4d5d-48c5-8c12-f9a42e168294 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:29:18 compute-0 nova_compute[186662]: 2026-02-19 19:29:18.116 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance df77e346-76b1-4b06-8611-44d3ac9fc3ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:29:18 compute-0 nova_compute[186662]: 2026-02-19 19:29:18.116 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:29:18 compute-0 nova_compute[186662]: 2026-02-19 19:29:18.116 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:29:17 up  1:00,  0 user,  load average: 0.41, 0.37, 0.38\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_84d22a8926d9401eb98cf092c0899a62': '2', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:29:18 compute-0 nova_compute[186662]: 2026-02-19 19:29:18.137 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing inventories for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Feb 19 19:29:18 compute-0 nova_compute[186662]: 2026-02-19 19:29:18.153 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating ProviderTree inventory for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Feb 19 19:29:18 compute-0 nova_compute[186662]: 2026-02-19 19:29:18.153 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating inventory in ProviderTree for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Feb 19 19:29:18 compute-0 nova_compute[186662]: 2026-02-19 19:29:18.166 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing aggregate associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Feb 19 19:29:18 compute-0 nova_compute[186662]: 2026-02-19 19:29:18.191 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing trait associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_SSE2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,HW_ARCH_X86_64,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_USB _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Feb 19 19:29:18 compute-0 nova_compute[186662]: 2026-02-19 19:29:18.243 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:29:18 compute-0 nova_compute[186662]: 2026-02-19 19:29:18.750 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:29:18 compute-0 nova_compute[186662]: 2026-02-19 19:29:18.893 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:19 compute-0 nova_compute[186662]: 2026-02-19 19:29:19.120 186666 DEBUG oslo_concurrency.lockutils [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "72cc675e-4d5d-48c5-8c12-f9a42e168294" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:19 compute-0 nova_compute[186662]: 2026-02-19 19:29:19.121 186666 DEBUG oslo_concurrency.lockutils [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "72cc675e-4d5d-48c5-8c12-f9a42e168294" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:19 compute-0 nova_compute[186662]: 2026-02-19 19:29:19.121 186666 DEBUG oslo_concurrency.lockutils [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "72cc675e-4d5d-48c5-8c12-f9a42e168294-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:19 compute-0 nova_compute[186662]: 2026-02-19 19:29:19.122 186666 DEBUG oslo_concurrency.lockutils [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "72cc675e-4d5d-48c5-8c12-f9a42e168294-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:19 compute-0 nova_compute[186662]: 2026-02-19 19:29:19.122 186666 DEBUG oslo_concurrency.lockutils [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "72cc675e-4d5d-48c5-8c12-f9a42e168294-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:19 compute-0 nova_compute[186662]: 2026-02-19 19:29:19.137 186666 INFO nova.compute.manager [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Terminating instance
Feb 19 19:29:19 compute-0 nova_compute[186662]: 2026-02-19 19:29:19.259 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:29:19 compute-0 nova_compute[186662]: 2026-02-19 19:29:19.259 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.182s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:19 compute-0 nova_compute[186662]: 2026-02-19 19:29:19.656 186666 DEBUG nova.compute.manager [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:29:19 compute-0 kernel: tap52549f9d-4d (unregistering): left promiscuous mode
Feb 19 19:29:19 compute-0 NetworkManager[56519]: <info>  [1771529359.6840] device (tap52549f9d-4d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:29:19 compute-0 ovn_controller[96653]: 2026-02-19T19:29:19Z|00088|binding|INFO|Releasing lport 52549f9d-4d31-4b47-a4ed-063a73c7fd04 from this chassis (sb_readonly=0)
Feb 19 19:29:19 compute-0 nova_compute[186662]: 2026-02-19 19:29:19.693 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:19 compute-0 ovn_controller[96653]: 2026-02-19T19:29:19Z|00089|binding|INFO|Setting lport 52549f9d-4d31-4b47-a4ed-063a73c7fd04 down in Southbound
Feb 19 19:29:19 compute-0 ovn_controller[96653]: 2026-02-19T19:29:19Z|00090|binding|INFO|Removing iface tap52549f9d-4d ovn-installed in OVS
Feb 19 19:29:19 compute-0 nova_compute[186662]: 2026-02-19 19:29:19.699 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:19 compute-0 nova_compute[186662]: 2026-02-19 19:29:19.705 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:19.714 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:c3:31 10.100.0.8'], port_security=['fa:16:3e:f4:c3:31 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '72cc675e-4d5d-48c5-8c12-f9a42e168294', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-23744514-9581-483b-ba8d-38106bcd89ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84d22a8926d9401eb98cf092c0899a62', 'neutron:revision_number': '5', 'neutron:security_group_ids': '286c3adc-3464-487a-ae0e-8231188cef8f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46bbe139-eec8-4de4-bfdc-b507967fb452, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=52549f9d-4d31-4b47-a4ed-063a73c7fd04) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:29:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:19.716 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 52549f9d-4d31-4b47-a4ed-063a73c7fd04 in datapath 23744514-9581-483b-ba8d-38106bcd89ef unbound from our chassis
Feb 19 19:29:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:19.717 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 23744514-9581-483b-ba8d-38106bcd89ef
Feb 19 19:29:19 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Deactivated successfully.
Feb 19 19:29:19 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Consumed 19.219s CPU time.
Feb 19 19:29:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:19.733 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[e99d0df8-ca9d-4010-a568-2fce8d3c0dde]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:19 compute-0 systemd-machined[156014]: Machine qemu-2-instance-00000005 terminated.
Feb 19 19:29:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:19.766 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[f6cce200-4713-4666-94cc-60a46b70f391]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:19.768 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[ecd42ccd-4dc7-415e-8da6-0b9830cc1539]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:19.803 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[982a3fa4-974c-4d95-9e41-bc2fce2d59d0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:19.820 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[602ca669-f934-458f-ad4c-9691ad358d32]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap23744514-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:f0:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 23, 'rx_bytes': 1792, 'tx_bytes': 1110, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 23, 'rx_bytes': 1792, 'tx_bytes': 1110, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341570, 'reachable_time': 39578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210576, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:19.836 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[c822d0df-5309-4f6f-8722-3a9da9dd7133]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap23744514-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341576, 'tstamp': 341576}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210577, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap23744514-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341578, 'tstamp': 341578}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210577, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:19.837 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23744514-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:19 compute-0 nova_compute[186662]: 2026-02-19 19:29:19.839 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:19 compute-0 nova_compute[186662]: 2026-02-19 19:29:19.844 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:19.845 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23744514-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:19.845 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:29:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:19.845 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap23744514-90, col_values=(('external_ids', {'iface-id': '25b16785-ed71-4ba6-a91b-c4dcee5ff875'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:19.846 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:29:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:19.847 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[11ae3d38-2837-42d9-a8b8-253fa43c04b0]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-23744514-9581-483b-ba8d-38106bcd89ef\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 23744514-9581-483b-ba8d-38106bcd89ef\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:19 compute-0 nova_compute[186662]: 2026-02-19 19:29:19.915 186666 INFO nova.virt.libvirt.driver [-] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Instance destroyed successfully.
Feb 19 19:29:19 compute-0 nova_compute[186662]: 2026-02-19 19:29:19.916 186666 DEBUG nova.objects.instance [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lazy-loading 'resources' on Instance uuid 72cc675e-4d5d-48c5-8c12-f9a42e168294 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.403 186666 DEBUG nova.compute.manager [req-b07cd628-0763-4b1e-afe5-66e2516d86b4 req-96414416-e90a-4812-94f8-a40fe06a7ab1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Received event network-vif-unplugged-52549f9d-4d31-4b47-a4ed-063a73c7fd04 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.404 186666 DEBUG oslo_concurrency.lockutils [req-b07cd628-0763-4b1e-afe5-66e2516d86b4 req-96414416-e90a-4812-94f8-a40fe06a7ab1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "72cc675e-4d5d-48c5-8c12-f9a42e168294-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.404 186666 DEBUG oslo_concurrency.lockutils [req-b07cd628-0763-4b1e-afe5-66e2516d86b4 req-96414416-e90a-4812-94f8-a40fe06a7ab1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "72cc675e-4d5d-48c5-8c12-f9a42e168294-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.405 186666 DEBUG oslo_concurrency.lockutils [req-b07cd628-0763-4b1e-afe5-66e2516d86b4 req-96414416-e90a-4812-94f8-a40fe06a7ab1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "72cc675e-4d5d-48c5-8c12-f9a42e168294-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.405 186666 DEBUG nova.compute.manager [req-b07cd628-0763-4b1e-afe5-66e2516d86b4 req-96414416-e90a-4812-94f8-a40fe06a7ab1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] No waiting events found dispatching network-vif-unplugged-52549f9d-4d31-4b47-a4ed-063a73c7fd04 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.406 186666 DEBUG nova.compute.manager [req-b07cd628-0763-4b1e-afe5-66e2516d86b4 req-96414416-e90a-4812-94f8-a40fe06a7ab1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Received event network-vif-unplugged-52549f9d-4d31-4b47-a4ed-063a73c7fd04 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.423 186666 DEBUG nova.virt.libvirt.vif [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:25:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1820161035',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1820161035',id=5,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:25:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='84d22a8926d9401eb98cf092c0899a62',ramdisk_id='',reservation_id='r-x5x8d73p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1567565925',owner_user_name='tempest-TestExecuteActionsViaActuator-1567565925-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:25:47Z,user_data=None,user_id='af924faf672a45b4b8708466af6eeb12',uuid=72cc675e-4d5d-48c5-8c12-f9a42e168294,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "52549f9d-4d31-4b47-a4ed-063a73c7fd04", "address": "fa:16:3e:f4:c3:31", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52549f9d-4d", "ovs_interfaceid": "52549f9d-4d31-4b47-a4ed-063a73c7fd04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.424 186666 DEBUG nova.network.os_vif_util [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converting VIF {"id": "52549f9d-4d31-4b47-a4ed-063a73c7fd04", "address": "fa:16:3e:f4:c3:31", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52549f9d-4d", "ovs_interfaceid": "52549f9d-4d31-4b47-a4ed-063a73c7fd04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.426 186666 DEBUG nova.network.os_vif_util [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:c3:31,bridge_name='br-int',has_traffic_filtering=True,id=52549f9d-4d31-4b47-a4ed-063a73c7fd04,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52549f9d-4d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.426 186666 DEBUG os_vif [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:c3:31,bridge_name='br-int',has_traffic_filtering=True,id=52549f9d-4d31-4b47-a4ed-063a73c7fd04,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52549f9d-4d') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.430 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.431 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52549f9d-4d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.432 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.435 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.436 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.437 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=ed894571-31e8-44fc-b324-b9e039599c75) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.438 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.440 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.443 186666 INFO os_vif [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:c3:31,bridge_name='br-int',has_traffic_filtering=True,id=52549f9d-4d31-4b47-a4ed-063a73c7fd04,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52549f9d-4d')
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.444 186666 INFO nova.virt.libvirt.driver [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Deleting instance files /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294_del
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.445 186666 INFO nova.virt.libvirt.driver [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Deletion of /var/lib/nova/instances/72cc675e-4d5d-48c5-8c12-f9a42e168294_del complete
Feb 19 19:29:20 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:20.463 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:29:20 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:20.463 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.463 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.962 186666 INFO nova.compute.manager [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Took 1.31 seconds to destroy the instance on the hypervisor.
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.963 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.963 186666 DEBUG nova.compute.manager [-] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.963 186666 DEBUG nova.network.neutron [-] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:29:20 compute-0 nova_compute[186662]: 2026-02-19 19:29:20.964 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:29:21 compute-0 nova_compute[186662]: 2026-02-19 19:29:21.139 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:29:21 compute-0 nova_compute[186662]: 2026-02-19 19:29:21.923 186666 DEBUG nova.network.neutron [-] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:29:22 compute-0 nova_compute[186662]: 2026-02-19 19:29:22.340 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:22 compute-0 nova_compute[186662]: 2026-02-19 19:29:22.440 186666 INFO nova.compute.manager [-] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Took 1.48 seconds to deallocate network for instance.
Feb 19 19:29:22 compute-0 nova_compute[186662]: 2026-02-19 19:29:22.477 186666 DEBUG nova.compute.manager [req-924262de-3cbb-4039-8b34-896625eaccd4 req-0325722e-f1f0-434f-bdea-9ccd20968136 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Received event network-vif-unplugged-52549f9d-4d31-4b47-a4ed-063a73c7fd04 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:29:22 compute-0 nova_compute[186662]: 2026-02-19 19:29:22.478 186666 DEBUG oslo_concurrency.lockutils [req-924262de-3cbb-4039-8b34-896625eaccd4 req-0325722e-f1f0-434f-bdea-9ccd20968136 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "72cc675e-4d5d-48c5-8c12-f9a42e168294-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:22 compute-0 nova_compute[186662]: 2026-02-19 19:29:22.478 186666 DEBUG oslo_concurrency.lockutils [req-924262de-3cbb-4039-8b34-896625eaccd4 req-0325722e-f1f0-434f-bdea-9ccd20968136 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "72cc675e-4d5d-48c5-8c12-f9a42e168294-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:22 compute-0 nova_compute[186662]: 2026-02-19 19:29:22.478 186666 DEBUG oslo_concurrency.lockutils [req-924262de-3cbb-4039-8b34-896625eaccd4 req-0325722e-f1f0-434f-bdea-9ccd20968136 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "72cc675e-4d5d-48c5-8c12-f9a42e168294-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:22 compute-0 nova_compute[186662]: 2026-02-19 19:29:22.478 186666 DEBUG nova.compute.manager [req-924262de-3cbb-4039-8b34-896625eaccd4 req-0325722e-f1f0-434f-bdea-9ccd20968136 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] No waiting events found dispatching network-vif-unplugged-52549f9d-4d31-4b47-a4ed-063a73c7fd04 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:29:22 compute-0 nova_compute[186662]: 2026-02-19 19:29:22.479 186666 DEBUG nova.compute.manager [req-924262de-3cbb-4039-8b34-896625eaccd4 req-0325722e-f1f0-434f-bdea-9ccd20968136 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Received event network-vif-unplugged-52549f9d-4d31-4b47-a4ed-063a73c7fd04 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:29:22 compute-0 nova_compute[186662]: 2026-02-19 19:29:22.479 186666 DEBUG nova.compute.manager [req-924262de-3cbb-4039-8b34-896625eaccd4 req-0325722e-f1f0-434f-bdea-9ccd20968136 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 72cc675e-4d5d-48c5-8c12-f9a42e168294] Received event network-vif-deleted-52549f9d-4d31-4b47-a4ed-063a73c7fd04 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:29:22 compute-0 nova_compute[186662]: 2026-02-19 19:29:22.960 186666 DEBUG oslo_concurrency.lockutils [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:22 compute-0 nova_compute[186662]: 2026-02-19 19:29:22.960 186666 DEBUG oslo_concurrency.lockutils [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:23 compute-0 nova_compute[186662]: 2026-02-19 19:29:23.039 186666 DEBUG nova.compute.provider_tree [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:29:23 compute-0 nova_compute[186662]: 2026-02-19 19:29:23.546 186666 DEBUG nova.scheduler.client.report [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:29:24 compute-0 nova_compute[186662]: 2026-02-19 19:29:24.062 186666 DEBUG oslo_concurrency.lockutils [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.102s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:24 compute-0 nova_compute[186662]: 2026-02-19 19:29:24.093 186666 INFO nova.scheduler.client.report [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Deleted allocations for instance 72cc675e-4d5d-48c5-8c12-f9a42e168294
Feb 19 19:29:25 compute-0 nova_compute[186662]: 2026-02-19 19:29:25.117 186666 DEBUG oslo_concurrency.lockutils [None req-2eb06e94-75e0-4103-a6d5-ef568d950329 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "72cc675e-4d5d-48c5-8c12-f9a42e168294" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.996s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:25 compute-0 nova_compute[186662]: 2026-02-19 19:29:25.438 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:25 compute-0 nova_compute[186662]: 2026-02-19 19:29:25.875 186666 DEBUG oslo_concurrency.lockutils [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:25 compute-0 nova_compute[186662]: 2026-02-19 19:29:25.876 186666 DEBUG oslo_concurrency.lockutils [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:25 compute-0 nova_compute[186662]: 2026-02-19 19:29:25.876 186666 DEBUG oslo_concurrency.lockutils [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:25 compute-0 nova_compute[186662]: 2026-02-19 19:29:25.877 186666 DEBUG oslo_concurrency.lockutils [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:25 compute-0 nova_compute[186662]: 2026-02-19 19:29:25.877 186666 DEBUG oslo_concurrency.lockutils [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:25 compute-0 nova_compute[186662]: 2026-02-19 19:29:25.889 186666 INFO nova.compute.manager [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Terminating instance
Feb 19 19:29:26 compute-0 nova_compute[186662]: 2026-02-19 19:29:26.408 186666 DEBUG nova.compute.manager [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:29:26 compute-0 kernel: tape43829dc-75 (unregistering): left promiscuous mode
Feb 19 19:29:26 compute-0 NetworkManager[56519]: <info>  [1771529366.4341] device (tape43829dc-75): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:29:26 compute-0 ovn_controller[96653]: 2026-02-19T19:29:26Z|00091|binding|INFO|Releasing lport e43829dc-7578-4c37-87d3-5cbc96a2767f from this chassis (sb_readonly=0)
Feb 19 19:29:26 compute-0 nova_compute[186662]: 2026-02-19 19:29:26.439 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:26 compute-0 ovn_controller[96653]: 2026-02-19T19:29:26Z|00092|binding|INFO|Setting lport e43829dc-7578-4c37-87d3-5cbc96a2767f down in Southbound
Feb 19 19:29:26 compute-0 ovn_controller[96653]: 2026-02-19T19:29:26Z|00093|binding|INFO|Removing iface tape43829dc-75 ovn-installed in OVS
Feb 19 19:29:26 compute-0 nova_compute[186662]: 2026-02-19 19:29:26.443 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:26 compute-0 nova_compute[186662]: 2026-02-19 19:29:26.447 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:26 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:26.454 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0c:fd:c8 10.100.0.11'], port_security=['fa:16:3e:0c:fd:c8 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'df77e346-76b1-4b06-8611-44d3ac9fc3ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-23744514-9581-483b-ba8d-38106bcd89ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84d22a8926d9401eb98cf092c0899a62', 'neutron:revision_number': '10', 'neutron:security_group_ids': '286c3adc-3464-487a-ae0e-8231188cef8f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46bbe139-eec8-4de4-bfdc-b507967fb452, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=e43829dc-7578-4c37-87d3-5cbc96a2767f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:29:26 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:26.455 105986 INFO neutron.agent.ovn.metadata.agent [-] Port e43829dc-7578-4c37-87d3-5cbc96a2767f in datapath 23744514-9581-483b-ba8d-38106bcd89ef unbound from our chassis
Feb 19 19:29:26 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:26.457 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 23744514-9581-483b-ba8d-38106bcd89ef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:29:26 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:26.457 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[25a85f5d-2156-4805-8569-e6e9fa535329]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:26 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:26.458 105986 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef namespace which is not needed anymore
Feb 19 19:29:26 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:26.465 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:26 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Deactivated successfully.
Feb 19 19:29:26 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Consumed 18.849s CPU time.
Feb 19 19:29:26 compute-0 systemd-machined[156014]: Machine qemu-3-instance-00000004 terminated.
Feb 19 19:29:26 compute-0 neutron-haproxy-ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef[209192]: [NOTICE]   (209196) : haproxy version is 3.0.5-8e879a5
Feb 19 19:29:26 compute-0 neutron-haproxy-ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef[209192]: [NOTICE]   (209196) : path to executable is /usr/sbin/haproxy
Feb 19 19:29:26 compute-0 neutron-haproxy-ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef[209192]: [WARNING]  (209196) : Exiting Master process...
Feb 19 19:29:26 compute-0 podman[210623]: 2026-02-19 19:29:26.571274504 +0000 UTC m=+0.026336944 container kill 8e547e8b1119dc1450f6dbd8ede72055b580def76a26047751a64a49519983b0 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:29:26 compute-0 neutron-haproxy-ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef[209192]: [ALERT]    (209196) : Current worker (209198) exited with code 143 (Terminated)
Feb 19 19:29:26 compute-0 neutron-haproxy-ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef[209192]: [WARNING]  (209196) : All workers exited. Exiting... (0)
Feb 19 19:29:26 compute-0 systemd[1]: libpod-8e547e8b1119dc1450f6dbd8ede72055b580def76a26047751a64a49519983b0.scope: Deactivated successfully.
Feb 19 19:29:26 compute-0 podman[210639]: 2026-02-19 19:29:26.612620686 +0000 UTC m=+0.024222002 container died 8e547e8b1119dc1450f6dbd8ede72055b580def76a26047751a64a49519983b0 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Feb 19 19:29:26 compute-0 nova_compute[186662]: 2026-02-19 19:29:26.629 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:26 compute-0 nova_compute[186662]: 2026-02-19 19:29:26.634 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8e547e8b1119dc1450f6dbd8ede72055b580def76a26047751a64a49519983b0-userdata-shm.mount: Deactivated successfully.
Feb 19 19:29:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-b4ab769aa4d319dcce232a1f0754f124605a3b2871b4ffe0e89eaf1baa5022e3-merged.mount: Deactivated successfully.
Feb 19 19:29:26 compute-0 podman[210639]: 2026-02-19 19:29:26.655446784 +0000 UTC m=+0.067048120 container cleanup 8e547e8b1119dc1450f6dbd8ede72055b580def76a26047751a64a49519983b0 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 19 19:29:26 compute-0 systemd[1]: libpod-conmon-8e547e8b1119dc1450f6dbd8ede72055b580def76a26047751a64a49519983b0.scope: Deactivated successfully.
Feb 19 19:29:26 compute-0 nova_compute[186662]: 2026-02-19 19:29:26.663 186666 INFO nova.virt.libvirt.driver [-] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Instance destroyed successfully.
Feb 19 19:29:26 compute-0 nova_compute[186662]: 2026-02-19 19:29:26.663 186666 DEBUG nova.objects.instance [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lazy-loading 'resources' on Instance uuid df77e346-76b1-4b06-8611-44d3ac9fc3ef obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:29:26 compute-0 podman[210651]: 2026-02-19 19:29:26.675649239 +0000 UTC m=+0.065603036 container remove 8e547e8b1119dc1450f6dbd8ede72055b580def76a26047751a64a49519983b0 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Feb 19 19:29:26 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:26.679 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[8faf33a5-c8c1-4736-b58f-82cc0e9adab9]: (4, ("Thu Feb 19 07:29:26 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef (8e547e8b1119dc1450f6dbd8ede72055b580def76a26047751a64a49519983b0)\n8e547e8b1119dc1450f6dbd8ede72055b580def76a26047751a64a49519983b0\nThu Feb 19 07:29:26 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef (8e547e8b1119dc1450f6dbd8ede72055b580def76a26047751a64a49519983b0)\n8e547e8b1119dc1450f6dbd8ede72055b580def76a26047751a64a49519983b0\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:26 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:26.681 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[d7dcebb8-461e-44ba-9c25-2bc990fee8aa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:26 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:26.681 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/23744514-9581-483b-ba8d-38106bcd89ef.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:29:26 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:26.682 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[2b394305-001d-4b6b-be4f-07e64287dd61]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:26 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:26.682 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23744514-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:26 compute-0 nova_compute[186662]: 2026-02-19 19:29:26.684 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:26 compute-0 kernel: tap23744514-90: left promiscuous mode
Feb 19 19:29:26 compute-0 nova_compute[186662]: 2026-02-19 19:29:26.689 186666 DEBUG nova.compute.manager [req-ed5919ab-d103-4cae-8d12-3efa825eb588 req-e13ec189-6f0e-4144-995f-ce5aa163cc46 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Received event network-vif-unplugged-e43829dc-7578-4c37-87d3-5cbc96a2767f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:29:26 compute-0 nova_compute[186662]: 2026-02-19 19:29:26.689 186666 DEBUG oslo_concurrency.lockutils [req-ed5919ab-d103-4cae-8d12-3efa825eb588 req-e13ec189-6f0e-4144-995f-ce5aa163cc46 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:26 compute-0 nova_compute[186662]: 2026-02-19 19:29:26.689 186666 DEBUG oslo_concurrency.lockutils [req-ed5919ab-d103-4cae-8d12-3efa825eb588 req-e13ec189-6f0e-4144-995f-ce5aa163cc46 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:26 compute-0 nova_compute[186662]: 2026-02-19 19:29:26.689 186666 DEBUG oslo_concurrency.lockutils [req-ed5919ab-d103-4cae-8d12-3efa825eb588 req-e13ec189-6f0e-4144-995f-ce5aa163cc46 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:26 compute-0 nova_compute[186662]: 2026-02-19 19:29:26.689 186666 DEBUG nova.compute.manager [req-ed5919ab-d103-4cae-8d12-3efa825eb588 req-e13ec189-6f0e-4144-995f-ce5aa163cc46 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] No waiting events found dispatching network-vif-unplugged-e43829dc-7578-4c37-87d3-5cbc96a2767f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:29:26 compute-0 nova_compute[186662]: 2026-02-19 19:29:26.690 186666 DEBUG nova.compute.manager [req-ed5919ab-d103-4cae-8d12-3efa825eb588 req-e13ec189-6f0e-4144-995f-ce5aa163cc46 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Received event network-vif-unplugged-e43829dc-7578-4c37-87d3-5cbc96a2767f for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:29:26 compute-0 nova_compute[186662]: 2026-02-19 19:29:26.691 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:26 compute-0 nova_compute[186662]: 2026-02-19 19:29:26.691 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:26 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:26.692 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[f45671c8-95fb-4a0c-bb8e-e445926d6f80]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:26 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:26.706 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a6e3fd9b-86f2-4a53-981f-37669832b2e1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:26 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:26.706 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[7c67af01-d10e-4f03-a785-cf5a33d4d02f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:26 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:26.718 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ff73425d-515e-4667-a09f-f57f151383fd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341566, 'reachable_time': 23451, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210688, 'error': None, 'target': 'ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:26 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:26.720 106358 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-23744514-9581-483b-ba8d-38106bcd89ef deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Feb 19 19:29:26 compute-0 systemd[1]: run-netns-ovnmeta\x2d23744514\x2d9581\x2d483b\x2dba8d\x2d38106bcd89ef.mount: Deactivated successfully.
Feb 19 19:29:26 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:26.720 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[3e942d04-4438-40eb-bb3f-fe1a4251c7e6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.169 186666 DEBUG nova.virt.libvirt.vif [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:25:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1723250421',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1723250421',id=4,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:26:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='84d22a8926d9401eb98cf092c0899a62',ramdisk_id='',reservation_id='r-23170twp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1567565925',owner_user_name='tempest-TestExecuteActionsViaActuator-1567565925-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:26:40Z,user_data=None,user_id='af924faf672a45b4b8708466af6eeb12',uuid=df77e346-76b1-4b06-8611-44d3ac9fc3ef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e43829dc-7578-4c37-87d3-5cbc96a2767f", "address": "fa:16:3e:0c:fd:c8", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape43829dc-75", "ovs_interfaceid": "e43829dc-7578-4c37-87d3-5cbc96a2767f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.169 186666 DEBUG nova.network.os_vif_util [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converting VIF {"id": "e43829dc-7578-4c37-87d3-5cbc96a2767f", "address": "fa:16:3e:0c:fd:c8", "network": {"id": "23744514-9581-483b-ba8d-38106bcd89ef", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-294879570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0de9f29fcea461c9f09c667b54fe8fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape43829dc-75", "ovs_interfaceid": "e43829dc-7578-4c37-87d3-5cbc96a2767f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.170 186666 DEBUG nova.network.os_vif_util [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0c:fd:c8,bridge_name='br-int',has_traffic_filtering=True,id=e43829dc-7578-4c37-87d3-5cbc96a2767f,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape43829dc-75') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.170 186666 DEBUG os_vif [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0c:fd:c8,bridge_name='br-int',has_traffic_filtering=True,id=e43829dc-7578-4c37-87d3-5cbc96a2767f,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape43829dc-75') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.171 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.171 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape43829dc-75, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.172 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.173 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.174 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.174 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=e2ec8905-8a44-4544-9969-bc83ef4619a6) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.174 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.175 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.176 186666 INFO os_vif [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0c:fd:c8,bridge_name='br-int',has_traffic_filtering=True,id=e43829dc-7578-4c37-87d3-5cbc96a2767f,network=Network(23744514-9581-483b-ba8d-38106bcd89ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape43829dc-75')
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.177 186666 INFO nova.virt.libvirt.driver [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Deleting instance files /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef_del
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.178 186666 INFO nova.virt.libvirt.driver [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Deletion of /var/lib/nova/instances/df77e346-76b1-4b06-8611-44d3ac9fc3ef_del complete
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.342 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.687 186666 INFO nova.compute.manager [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Took 1.28 seconds to destroy the instance on the hypervisor.
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.688 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.688 186666 DEBUG nova.compute.manager [-] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.688 186666 DEBUG nova.network.neutron [-] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:29:27 compute-0 nova_compute[186662]: 2026-02-19 19:29:27.689 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:29:28 compute-0 nova_compute[186662]: 2026-02-19 19:29:28.581 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:29:28 compute-0 nova_compute[186662]: 2026-02-19 19:29:28.749 186666 DEBUG nova.compute.manager [req-0a0c729c-92f3-4560-ac23-5bea80599c11 req-4bcc5589-94e0-4829-988c-f8ddc66de7a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Received event network-vif-unplugged-e43829dc-7578-4c37-87d3-5cbc96a2767f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:29:28 compute-0 nova_compute[186662]: 2026-02-19 19:29:28.749 186666 DEBUG oslo_concurrency.lockutils [req-0a0c729c-92f3-4560-ac23-5bea80599c11 req-4bcc5589-94e0-4829-988c-f8ddc66de7a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:28 compute-0 nova_compute[186662]: 2026-02-19 19:29:28.749 186666 DEBUG oslo_concurrency.lockutils [req-0a0c729c-92f3-4560-ac23-5bea80599c11 req-4bcc5589-94e0-4829-988c-f8ddc66de7a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:28 compute-0 nova_compute[186662]: 2026-02-19 19:29:28.749 186666 DEBUG oslo_concurrency.lockutils [req-0a0c729c-92f3-4560-ac23-5bea80599c11 req-4bcc5589-94e0-4829-988c-f8ddc66de7a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:28 compute-0 nova_compute[186662]: 2026-02-19 19:29:28.750 186666 DEBUG nova.compute.manager [req-0a0c729c-92f3-4560-ac23-5bea80599c11 req-4bcc5589-94e0-4829-988c-f8ddc66de7a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] No waiting events found dispatching network-vif-unplugged-e43829dc-7578-4c37-87d3-5cbc96a2767f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:29:28 compute-0 nova_compute[186662]: 2026-02-19 19:29:28.750 186666 DEBUG nova.compute.manager [req-0a0c729c-92f3-4560-ac23-5bea80599c11 req-4bcc5589-94e0-4829-988c-f8ddc66de7a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Received event network-vif-unplugged-e43829dc-7578-4c37-87d3-5cbc96a2767f for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:29:29 compute-0 podman[210689]: 2026-02-19 19:29:29.323941386 +0000 UTC m=+0.095954605 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:29:29 compute-0 nova_compute[186662]: 2026-02-19 19:29:29.653 186666 DEBUG nova.compute.manager [req-c1d7892b-ec06-4c7f-9fb6-9ec9d594c5c6 req-0e7652a8-7315-4a88-8503-de214a8c36d6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Received event network-vif-deleted-e43829dc-7578-4c37-87d3-5cbc96a2767f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:29:29 compute-0 nova_compute[186662]: 2026-02-19 19:29:29.654 186666 INFO nova.compute.manager [req-c1d7892b-ec06-4c7f-9fb6-9ec9d594c5c6 req-0e7652a8-7315-4a88-8503-de214a8c36d6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Neutron deleted interface e43829dc-7578-4c37-87d3-5cbc96a2767f; detaching it from the instance and deleting it from the info cache
Feb 19 19:29:29 compute-0 nova_compute[186662]: 2026-02-19 19:29:29.654 186666 DEBUG nova.network.neutron [req-c1d7892b-ec06-4c7f-9fb6-9ec9d594c5c6 req-0e7652a8-7315-4a88-8503-de214a8c36d6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:29:29 compute-0 podman[196025]: time="2026-02-19T19:29:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:29:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:29:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:29:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:29:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2194 "" "Go-http-client/1.1"
Feb 19 19:29:30 compute-0 nova_compute[186662]: 2026-02-19 19:29:30.107 186666 DEBUG nova.network.neutron [-] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:29:30 compute-0 nova_compute[186662]: 2026-02-19 19:29:30.162 186666 DEBUG nova.compute.manager [req-c1d7892b-ec06-4c7f-9fb6-9ec9d594c5c6 req-0e7652a8-7315-4a88-8503-de214a8c36d6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Detach interface failed, port_id=e43829dc-7578-4c37-87d3-5cbc96a2767f, reason: Instance df77e346-76b1-4b06-8611-44d3ac9fc3ef could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Feb 19 19:29:30 compute-0 nova_compute[186662]: 2026-02-19 19:29:30.613 186666 INFO nova.compute.manager [-] [instance: df77e346-76b1-4b06-8611-44d3ac9fc3ef] Took 2.93 seconds to deallocate network for instance.
Feb 19 19:29:31 compute-0 nova_compute[186662]: 2026-02-19 19:29:31.134 186666 DEBUG oslo_concurrency.lockutils [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:31 compute-0 nova_compute[186662]: 2026-02-19 19:29:31.135 186666 DEBUG oslo_concurrency.lockutils [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:31 compute-0 nova_compute[186662]: 2026-02-19 19:29:31.195 186666 DEBUG nova.compute.provider_tree [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:29:31 compute-0 openstack_network_exporter[198916]: ERROR   19:29:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:29:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:29:31 compute-0 openstack_network_exporter[198916]: ERROR   19:29:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:29:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:29:31 compute-0 nova_compute[186662]: 2026-02-19 19:29:31.702 186666 DEBUG nova.scheduler.client.report [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:29:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:32.124 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:29:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:32.124 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:29:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:32.125 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:32 compute-0 nova_compute[186662]: 2026-02-19 19:29:32.176 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:32 compute-0 nova_compute[186662]: 2026-02-19 19:29:32.211 186666 DEBUG oslo_concurrency.lockutils [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.076s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:32 compute-0 nova_compute[186662]: 2026-02-19 19:29:32.256 186666 INFO nova.scheduler.client.report [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Deleted allocations for instance df77e346-76b1-4b06-8611-44d3ac9fc3ef
Feb 19 19:29:32 compute-0 nova_compute[186662]: 2026-02-19 19:29:32.343 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:33 compute-0 nova_compute[186662]: 2026-02-19 19:29:33.306 186666 DEBUG oslo_concurrency.lockutils [None req-b8a20858-8637-4c43-bd54-5667d4bca2a6 af924faf672a45b4b8708466af6eeb12 84d22a8926d9401eb98cf092c0899a62 - - default default] Lock "df77e346-76b1-4b06-8611-44d3ac9fc3ef" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.430s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:29:33 compute-0 podman[210711]: 2026-02-19 19:29:33.32488562 +0000 UTC m=+0.092767697 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 19 19:29:35 compute-0 podman[210735]: 2026-02-19 19:29:35.306949116 +0000 UTC m=+0.088676890 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, 
org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:29:37 compute-0 nova_compute[186662]: 2026-02-19 19:29:37.178 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:37 compute-0 nova_compute[186662]: 2026-02-19 19:29:37.345 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:40 compute-0 nova_compute[186662]: 2026-02-19 19:29:40.011 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:42 compute-0 nova_compute[186662]: 2026-02-19 19:29:42.179 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:42 compute-0 nova_compute[186662]: 2026-02-19 19:29:42.347 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:44 compute-0 podman[210763]: 2026-02-19 19:29:44.300777944 +0000 UTC m=+0.069571760 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 19 19:29:44 compute-0 sshd-session[210761]: Invalid user n8n from 189.165.79.177 port 46602
Feb 19 19:29:44 compute-0 sshd-session[210761]: Received disconnect from 189.165.79.177 port 46602:11: Bye Bye [preauth]
Feb 19 19:29:44 compute-0 sshd-session[210761]: Disconnected from invalid user n8n 189.165.79.177 port 46602 [preauth]
Feb 19 19:29:47 compute-0 nova_compute[186662]: 2026-02-19 19:29:47.182 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:47 compute-0 nova_compute[186662]: 2026-02-19 19:29:47.349 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:51 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:51.171 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:3a:9b 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a543a13-7551-4f54-8112-48a9ec62e1bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0db250c0dbdb430dab926e3c448a8a79', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6837b7ea-7979-471d-8324-067c4fb4f754, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7240d498-9bfd-45c4-907f-9700c860bcaf) old=Port_Binding(mac=['fa:16:3e:e8:3a:9b'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a543a13-7551-4f54-8112-48a9ec62e1bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0db250c0dbdb430dab926e3c448a8a79', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:29:51 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:51.172 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7240d498-9bfd-45c4-907f-9700c860bcaf in datapath 0a543a13-7551-4f54-8112-48a9ec62e1bb updated
Feb 19 19:29:51 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:51.173 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a543a13-7551-4f54-8112-48a9ec62e1bb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:29:51 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:51.174 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[9f51e796-0147-460a-a3c0-cefec24c89d5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:52 compute-0 nova_compute[186662]: 2026-02-19 19:29:52.184 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:52 compute-0 nova_compute[186662]: 2026-02-19 19:29:52.352 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:57 compute-0 nova_compute[186662]: 2026-02-19 19:29:57.186 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:57 compute-0 nova_compute[186662]: 2026-02-19 19:29:57.353 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:29:58 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:58.896 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:80:65 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-85a1abf0-7f2f-4bba-90a0-e6da53bc9a51', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85a1abf0-7f2f-4bba-90a0-e6da53bc9a51', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4336eca41984cf8bd35f5039db90e19', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=051dea4c-f6ec-43b3-a431-2b7eb21b5b29, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=059cb0f6-2204-4358-bec0-9ea713da65f5) old=Port_Binding(mac=['fa:16:3e:9b:80:65'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-85a1abf0-7f2f-4bba-90a0-e6da53bc9a51', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85a1abf0-7f2f-4bba-90a0-e6da53bc9a51', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4336eca41984cf8bd35f5039db90e19', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:29:58 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:58.896 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 059cb0f6-2204-4358-bec0-9ea713da65f5 in datapath 85a1abf0-7f2f-4bba-90a0-e6da53bc9a51 updated
Feb 19 19:29:58 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:58.897 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 85a1abf0-7f2f-4bba-90a0-e6da53bc9a51, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:29:58 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:29:58.897 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a7554b40-6e03-401f-85a3-eb484bd92dab]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:29:59 compute-0 podman[196025]: time="2026-02-19T19:29:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:29:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:29:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:29:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:29:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2196 "" "Go-http-client/1.1"
Feb 19 19:30:00 compute-0 podman[210788]: 2026-02-19 19:30:00.263562069 +0000 UTC m=+0.039069969 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 19 19:30:01 compute-0 openstack_network_exporter[198916]: ERROR   19:30:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:30:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:30:01 compute-0 openstack_network_exporter[198916]: ERROR   19:30:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:30:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:30:02 compute-0 nova_compute[186662]: 2026-02-19 19:30:02.189 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:02 compute-0 nova_compute[186662]: 2026-02-19 19:30:02.358 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:04 compute-0 podman[210808]: 2026-02-19 19:30:04.27725502 +0000 UTC m=+0.050747720 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, name=ubi9/ubi-minimal)
Feb 19 19:30:06 compute-0 podman[210830]: 2026-02-19 19:30:06.288331041 +0000 UTC m=+0.068314331 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0)
Feb 19 19:30:07 compute-0 nova_compute[186662]: 2026-02-19 19:30:07.191 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:07 compute-0 nova_compute[186662]: 2026-02-19 19:30:07.360 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:12 compute-0 nova_compute[186662]: 2026-02-19 19:30:12.193 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:12 compute-0 nova_compute[186662]: 2026-02-19 19:30:12.362 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:12 compute-0 nova_compute[186662]: 2026-02-19 19:30:12.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:30:12 compute-0 nova_compute[186662]: 2026-02-19 19:30:12.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:30:12 compute-0 nova_compute[186662]: 2026-02-19 19:30:12.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:30:12 compute-0 nova_compute[186662]: 2026-02-19 19:30:12.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:30:12 compute-0 nova_compute[186662]: 2026-02-19 19:30:12.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:30:12 compute-0 nova_compute[186662]: 2026-02-19 19:30:12.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:30:12 compute-0 nova_compute[186662]: 2026-02-19 19:30:12.576 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Feb 19 19:30:12 compute-0 ovn_controller[96653]: 2026-02-19T19:30:12Z|00094|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Feb 19 19:30:14 compute-0 nova_compute[186662]: 2026-02-19 19:30:14.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:30:14 compute-0 nova_compute[186662]: 2026-02-19 19:30:14.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:30:14 compute-0 nova_compute[186662]: 2026-02-19 19:30:14.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:30:15 compute-0 podman[210856]: 2026-02-19 19:30:15.282515585 +0000 UTC m=+0.057690730 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:30:16 compute-0 nova_compute[186662]: 2026-02-19 19:30:16.171 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:30:16 compute-0 nova_compute[186662]: 2026-02-19 19:30:16.172 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:30:16 compute-0 nova_compute[186662]: 2026-02-19 19:30:16.681 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:30:16 compute-0 nova_compute[186662]: 2026-02-19 19:30:16.681 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:30:16 compute-0 nova_compute[186662]: 2026-02-19 19:30:16.681 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:30:16 compute-0 nova_compute[186662]: 2026-02-19 19:30:16.681 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:30:16 compute-0 nova_compute[186662]: 2026-02-19 19:30:16.837 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:30:16 compute-0 nova_compute[186662]: 2026-02-19 19:30:16.839 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:30:16 compute-0 nova_compute[186662]: 2026-02-19 19:30:16.856 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:30:16 compute-0 nova_compute[186662]: 2026-02-19 19:30:16.857 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5851MB free_disk=72.97652053833008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:30:16 compute-0 nova_compute[186662]: 2026-02-19 19:30:16.857 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:30:16 compute-0 nova_compute[186662]: 2026-02-19 19:30:16.858 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:30:17 compute-0 nova_compute[186662]: 2026-02-19 19:30:17.195 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:17 compute-0 nova_compute[186662]: 2026-02-19 19:30:17.363 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:17 compute-0 nova_compute[186662]: 2026-02-19 19:30:17.914 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:30:17 compute-0 nova_compute[186662]: 2026-02-19 19:30:17.915 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:30:16 up  1:01,  0 user,  load average: 0.39, 0.35, 0.37\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:30:17 compute-0 nova_compute[186662]: 2026-02-19 19:30:17.939 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:30:18 compute-0 nova_compute[186662]: 2026-02-19 19:30:18.446 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:30:18 compute-0 nova_compute[186662]: 2026-02-19 19:30:18.954 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:30:18 compute-0 nova_compute[186662]: 2026-02-19 19:30:18.955 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.097s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:30:18 compute-0 nova_compute[186662]: 2026-02-19 19:30:18.955 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:30:18 compute-0 nova_compute[186662]: 2026-02-19 19:30:18.956 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Feb 19 19:30:19 compute-0 nova_compute[186662]: 2026-02-19 19:30:19.461 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Feb 19 19:30:21 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 19 19:30:22 compute-0 nova_compute[186662]: 2026-02-19 19:30:22.197 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:22 compute-0 sshd-session[210884]: Invalid user claude from 197.211.55.20 port 46574
Feb 19 19:30:22 compute-0 nova_compute[186662]: 2026-02-19 19:30:22.365 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:22 compute-0 sshd-session[210884]: Received disconnect from 197.211.55.20 port 46574:11: Bye Bye [preauth]
Feb 19 19:30:22 compute-0 sshd-session[210884]: Disconnected from invalid user claude 197.211.55.20 port 46574 [preauth]
Feb 19 19:30:25 compute-0 nova_compute[186662]: 2026-02-19 19:30:25.596 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Acquiring lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:30:25 compute-0 nova_compute[186662]: 2026-02-19 19:30:25.596 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:30:26 compute-0 nova_compute[186662]: 2026-02-19 19:30:26.101 186666 DEBUG nova.compute.manager [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Feb 19 19:30:26 compute-0 nova_compute[186662]: 2026-02-19 19:30:26.642 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:30:26 compute-0 nova_compute[186662]: 2026-02-19 19:30:26.642 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:30:26 compute-0 nova_compute[186662]: 2026-02-19 19:30:26.646 186666 DEBUG nova.virt.hardware [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Feb 19 19:30:26 compute-0 nova_compute[186662]: 2026-02-19 19:30:26.647 186666 INFO nova.compute.claims [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Claim successful on node compute-0.ctlplane.example.com
Feb 19 19:30:27 compute-0 nova_compute[186662]: 2026-02-19 19:30:27.198 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:27 compute-0 nova_compute[186662]: 2026-02-19 19:30:27.364 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:27 compute-0 nova_compute[186662]: 2026-02-19 19:30:27.706 186666 DEBUG nova.compute.provider_tree [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:30:28 compute-0 nova_compute[186662]: 2026-02-19 19:30:28.212 186666 DEBUG nova.scheduler.client.report [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:30:28 compute-0 nova_compute[186662]: 2026-02-19 19:30:28.721 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.079s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:30:28 compute-0 nova_compute[186662]: 2026-02-19 19:30:28.722 186666 DEBUG nova.compute.manager [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Feb 19 19:30:29 compute-0 nova_compute[186662]: 2026-02-19 19:30:29.232 186666 DEBUG nova.compute.manager [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Feb 19 19:30:29 compute-0 nova_compute[186662]: 2026-02-19 19:30:29.232 186666 DEBUG nova.network.neutron [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Feb 19 19:30:29 compute-0 nova_compute[186662]: 2026-02-19 19:30:29.233 186666 WARNING neutronclient.v2_0.client [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:30:29 compute-0 nova_compute[186662]: 2026-02-19 19:30:29.233 186666 WARNING neutronclient.v2_0.client [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:30:29 compute-0 nova_compute[186662]: 2026-02-19 19:30:29.738 186666 INFO nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 19 19:30:29 compute-0 podman[196025]: time="2026-02-19T19:30:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:30:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:30:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:30:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:30:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2188 "" "Go-http-client/1.1"
Feb 19 19:30:29 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:29.907 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:30:29 compute-0 nova_compute[186662]: 2026-02-19 19:30:29.908 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:29 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:29.909 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:30:30 compute-0 nova_compute[186662]: 2026-02-19 19:30:30.034 186666 DEBUG nova.network.neutron [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Successfully created port: 3a879a6d-2e72-4536-91fc-3d5cf7c630dd _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Feb 19 19:30:30 compute-0 nova_compute[186662]: 2026-02-19 19:30:30.246 186666 DEBUG nova.compute.manager [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Feb 19 19:30:30 compute-0 nova_compute[186662]: 2026-02-19 19:30:30.818 186666 DEBUG nova.network.neutron [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Successfully updated port: 3a879a6d-2e72-4536-91fc-3d5cf7c630dd _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Feb 19 19:30:30 compute-0 nova_compute[186662]: 2026-02-19 19:30:30.888 186666 DEBUG nova.compute.manager [req-197f6092-93ec-431e-ab23-8fe18702b67d req-8bf940c6-b3b8-40fe-95cd-f9d42cd94d8c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Received event network-changed-3a879a6d-2e72-4536-91fc-3d5cf7c630dd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:30:30 compute-0 nova_compute[186662]: 2026-02-19 19:30:30.888 186666 DEBUG nova.compute.manager [req-197f6092-93ec-431e-ab23-8fe18702b67d req-8bf940c6-b3b8-40fe-95cd-f9d42cd94d8c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Refreshing instance network info cache due to event network-changed-3a879a6d-2e72-4536-91fc-3d5cf7c630dd. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Feb 19 19:30:30 compute-0 nova_compute[186662]: 2026-02-19 19:30:30.889 186666 DEBUG oslo_concurrency.lockutils [req-197f6092-93ec-431e-ab23-8fe18702b67d req-8bf940c6-b3b8-40fe-95cd-f9d42cd94d8c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-c3ef5f5e-bbad-410e-b927-88f69b1cff5b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:30:30 compute-0 nova_compute[186662]: 2026-02-19 19:30:30.889 186666 DEBUG oslo_concurrency.lockutils [req-197f6092-93ec-431e-ab23-8fe18702b67d req-8bf940c6-b3b8-40fe-95cd-f9d42cd94d8c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-c3ef5f5e-bbad-410e-b927-88f69b1cff5b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:30:30 compute-0 nova_compute[186662]: 2026-02-19 19:30:30.889 186666 DEBUG nova.network.neutron [req-197f6092-93ec-431e-ab23-8fe18702b67d req-8bf940c6-b3b8-40fe-95cd-f9d42cd94d8c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Refreshing network info cache for port 3a879a6d-2e72-4536-91fc-3d5cf7c630dd _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.263 186666 DEBUG nova.compute.manager [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.264 186666 DEBUG nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.264 186666 INFO nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Creating image(s)
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.264 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Acquiring lock "/var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.264 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Lock "/var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.265 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Lock "/var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.265 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.268 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.269 186666 DEBUG oslo_concurrency.processutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:30:31 compute-0 podman[210889]: 2026-02-19 19:30:31.287753213 +0000 UTC m=+0.068001777 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.309 186666 DEBUG oslo_concurrency.processutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.310 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.311 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.311 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.314 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.314 186666 DEBUG oslo_concurrency.processutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.325 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Acquiring lock "refresh_cache-c3ef5f5e-bbad-410e-b927-88f69b1cff5b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.371 186666 DEBUG oslo_concurrency.processutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.372 186666 DEBUG oslo_concurrency.processutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.395 186666 WARNING neutronclient.v2_0.client [req-197f6092-93ec-431e-ab23-8fe18702b67d req-8bf940c6-b3b8-40fe-95cd-f9d42cd94d8c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.398 186666 DEBUG oslo_concurrency.processutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk 1073741824" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.399 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.088s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.399 186666 DEBUG oslo_concurrency.processutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:30:31 compute-0 openstack_network_exporter[198916]: ERROR   19:30:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:30:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:30:31 compute-0 openstack_network_exporter[198916]: ERROR   19:30:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:30:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.452 186666 DEBUG oslo_concurrency.processutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.453 186666 DEBUG nova.virt.disk.api [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Checking if we can resize image /var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.453 186666 DEBUG oslo_concurrency.processutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.505 186666 DEBUG oslo_concurrency.processutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.506 186666 DEBUG nova.virt.disk.api [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Cannot resize image /var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.506 186666 DEBUG nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.506 186666 DEBUG nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Ensure instance console log exists: /var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.507 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.507 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.507 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.587 186666 DEBUG nova.network.neutron [req-197f6092-93ec-431e-ab23-8fe18702b67d req-8bf940c6-b3b8-40fe-95cd-f9d42cd94d8c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:30:31 compute-0 nova_compute[186662]: 2026-02-19 19:30:31.702 186666 DEBUG nova.network.neutron [req-197f6092-93ec-431e-ab23-8fe18702b67d req-8bf940c6-b3b8-40fe-95cd-f9d42cd94d8c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:30:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:32.126 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:30:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:32.126 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:30:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:32.127 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:30:32 compute-0 nova_compute[186662]: 2026-02-19 19:30:32.201 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:32 compute-0 nova_compute[186662]: 2026-02-19 19:30:32.208 186666 DEBUG oslo_concurrency.lockutils [req-197f6092-93ec-431e-ab23-8fe18702b67d req-8bf940c6-b3b8-40fe-95cd-f9d42cd94d8c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-c3ef5f5e-bbad-410e-b927-88f69b1cff5b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:30:32 compute-0 nova_compute[186662]: 2026-02-19 19:30:32.209 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Acquired lock "refresh_cache-c3ef5f5e-bbad-410e-b927-88f69b1cff5b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:30:32 compute-0 nova_compute[186662]: 2026-02-19 19:30:32.209 186666 DEBUG nova.network.neutron [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:30:32 compute-0 nova_compute[186662]: 2026-02-19 19:30:32.368 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:33 compute-0 nova_compute[186662]: 2026-02-19 19:30:33.320 186666 DEBUG nova.network.neutron [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:30:33 compute-0 nova_compute[186662]: 2026-02-19 19:30:33.576 186666 WARNING neutronclient.v2_0.client [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:30:33 compute-0 nova_compute[186662]: 2026-02-19 19:30:33.735 186666 DEBUG nova.network.neutron [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Updating instance_info_cache with network_info: [{"id": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "address": "fa:16:3e:8e:a5:99", "network": {"id": "0a543a13-7551-4f54-8112-48a9ec62e1bb", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1720256855-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0db250c0dbdb430dab926e3c448a8a79", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a879a6d-2e", "ovs_interfaceid": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.241 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Releasing lock "refresh_cache-c3ef5f5e-bbad-410e-b927-88f69b1cff5b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.242 186666 DEBUG nova.compute.manager [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Instance network_info: |[{"id": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "address": "fa:16:3e:8e:a5:99", "network": {"id": "0a543a13-7551-4f54-8112-48a9ec62e1bb", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1720256855-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0db250c0dbdb430dab926e3c448a8a79", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a879a6d-2e", "ovs_interfaceid": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.246 186666 DEBUG nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Start _get_guest_xml network_info=[{"id": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "address": "fa:16:3e:8e:a5:99", "network": {"id": "0a543a13-7551-4f54-8112-48a9ec62e1bb", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1720256855-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0db250c0dbdb430dab926e3c448a8a79", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a879a6d-2e", "ovs_interfaceid": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'image_id': 'b8007ea6-afa7-4c5a-abc0-d9d7338ce087'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.250 186666 WARNING nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.252 186666 DEBUG nova.virt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteBasicStrategy-server-2040083590', uuid='c3ef5f5e-bbad-410e-b927-88f69b1cff5b'), owner=OwnerMeta(userid='856c67cb702b4cd7a27eb3f75ca31608', username='tempest-TestExecuteBasicStrategy-717047797-project-admin', projectid='c4336eca41984cf8bd35f5039db90e19', projectname='tempest-TestExecuteBasicStrategy-717047797'), image=ImageMeta(id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "address": "fa:16:3e:8e:a5:99", "network": {"id": "0a543a13-7551-4f54-8112-48a9ec62e1bb", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1720256855-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0db250c0dbdb430dab926e3c448a8a79", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a879a6d-2e", "ovs_interfaceid": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", 
"qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1771529434.2524803) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.257 186666 DEBUG nova.virt.libvirt.host [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.258 186666 DEBUG nova.virt.libvirt.host [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.262 186666 DEBUG nova.virt.libvirt.host [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.263 186666 DEBUG nova.virt.libvirt.host [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.265 186666 DEBUG nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.265 186666 DEBUG nova.virt.hardware [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-19T19:19:33Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.266 186666 DEBUG nova.virt.hardware [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.266 186666 DEBUG nova.virt.hardware [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.267 186666 DEBUG nova.virt.hardware [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.267 186666 DEBUG nova.virt.hardware [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.268 186666 DEBUG nova.virt.hardware [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.268 186666 DEBUG nova.virt.hardware [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.269 186666 DEBUG nova.virt.hardware [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.269 186666 DEBUG nova.virt.hardware [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.269 186666 DEBUG nova.virt.hardware [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.270 186666 DEBUG nova.virt.hardware [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.275 186666 DEBUG nova.virt.libvirt.vif [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:30:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-2040083590',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-2040083590',id=11,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4336eca41984cf8bd35f5039db90e19',ramdisk_id='',reservation_id='r-wobtfndh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-717047797',owner_user_name='tempest-TestExecuteBasicStrategy-7170477
97-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:30:30Z,user_data=None,user_id='856c67cb702b4cd7a27eb3f75ca31608',uuid=c3ef5f5e-bbad-410e-b927-88f69b1cff5b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "address": "fa:16:3e:8e:a5:99", "network": {"id": "0a543a13-7551-4f54-8112-48a9ec62e1bb", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1720256855-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0db250c0dbdb430dab926e3c448a8a79", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a879a6d-2e", "ovs_interfaceid": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.276 186666 DEBUG nova.network.os_vif_util [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Converting VIF {"id": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "address": "fa:16:3e:8e:a5:99", "network": {"id": "0a543a13-7551-4f54-8112-48a9ec62e1bb", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1720256855-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0db250c0dbdb430dab926e3c448a8a79", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a879a6d-2e", "ovs_interfaceid": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.277 186666 DEBUG nova.network.os_vif_util [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:a5:99,bridge_name='br-int',has_traffic_filtering=True,id=3a879a6d-2e72-4536-91fc-3d5cf7c630dd,network=Network(0a543a13-7551-4f54-8112-48a9ec62e1bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a879a6d-2e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.278 186666 DEBUG nova.objects.instance [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Lazy-loading 'pci_devices' on Instance uuid c3ef5f5e-bbad-410e-b927-88f69b1cff5b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.787 186666 DEBUG nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] End _get_guest_xml xml=<domain type="kvm">
Feb 19 19:30:34 compute-0 nova_compute[186662]:   <uuid>c3ef5f5e-bbad-410e-b927-88f69b1cff5b</uuid>
Feb 19 19:30:34 compute-0 nova_compute[186662]:   <name>instance-0000000b</name>
Feb 19 19:30:34 compute-0 nova_compute[186662]:   <memory>131072</memory>
Feb 19 19:30:34 compute-0 nova_compute[186662]:   <vcpu>1</vcpu>
Feb 19 19:30:34 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteBasicStrategy-server-2040083590</nova:name>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:30:34</nova:creationTime>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:30:34 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:30:34 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:30:34 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:30:34 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:30:34 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:30:34 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:30:34 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:30:34 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:30:34 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:30:34 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:30:34 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:30:34 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:30:34 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:30:34 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:30:34 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:30:34 compute-0 nova_compute[186662]:         <nova:user uuid="856c67cb702b4cd7a27eb3f75ca31608">tempest-TestExecuteBasicStrategy-717047797-project-admin</nova:user>
Feb 19 19:30:34 compute-0 nova_compute[186662]:         <nova:project uuid="c4336eca41984cf8bd35f5039db90e19">tempest-TestExecuteBasicStrategy-717047797</nova:project>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:30:34 compute-0 nova_compute[186662]:         <nova:port uuid="3a879a6d-2e72-4536-91fc-3d5cf7c630dd">
Feb 19 19:30:34 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:30:34 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:30:34 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <system>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <entry name="serial">c3ef5f5e-bbad-410e-b927-88f69b1cff5b</entry>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <entry name="uuid">c3ef5f5e-bbad-410e-b927-88f69b1cff5b</entry>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     </system>
Feb 19 19:30:34 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:30:34 compute-0 nova_compute[186662]:   <os>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:   </os>
Feb 19 19:30:34 compute-0 nova_compute[186662]:   <features>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <vmcoreinfo/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:   </features>
Feb 19 19:30:34 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:30:34 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact">
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <model>Nehalem</model>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <topology sockets="1" cores="1" threads="1"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:30:34 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk.config"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <interface type="ethernet">
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <mac address="fa:16:3e:8e:a5:99"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <mtu size="1442"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <target dev="tap3a879a6d-2e"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <serial type="pty">
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/console.log" append="off"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <video>
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     </video>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <controller type="usb" index="0"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:30:34 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:30:34 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:30:34 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:30:34 compute-0 nova_compute[186662]: </domain>
Feb 19 19:30:34 compute-0 nova_compute[186662]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.789 186666 DEBUG nova.compute.manager [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Preparing to wait for external event network-vif-plugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.789 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Acquiring lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.790 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.791 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.792 186666 DEBUG nova.virt.libvirt.vif [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:30:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-2040083590',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-2040083590',id=11,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4336eca41984cf8bd35f5039db90e19',ramdisk_id='',reservation_id='r-wobtfndh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-717047797',owner_user_name='tempest-TestExecuteBasicStrate
gy-717047797-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:30:30Z,user_data=None,user_id='856c67cb702b4cd7a27eb3f75ca31608',uuid=c3ef5f5e-bbad-410e-b927-88f69b1cff5b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "address": "fa:16:3e:8e:a5:99", "network": {"id": "0a543a13-7551-4f54-8112-48a9ec62e1bb", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1720256855-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0db250c0dbdb430dab926e3c448a8a79", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a879a6d-2e", "ovs_interfaceid": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.793 186666 DEBUG nova.network.os_vif_util [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Converting VIF {"id": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "address": "fa:16:3e:8e:a5:99", "network": {"id": "0a543a13-7551-4f54-8112-48a9ec62e1bb", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1720256855-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0db250c0dbdb430dab926e3c448a8a79", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a879a6d-2e", "ovs_interfaceid": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.794 186666 DEBUG nova.network.os_vif_util [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:a5:99,bridge_name='br-int',has_traffic_filtering=True,id=3a879a6d-2e72-4536-91fc-3d5cf7c630dd,network=Network(0a543a13-7551-4f54-8112-48a9ec62e1bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a879a6d-2e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.795 186666 DEBUG os_vif [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:a5:99,bridge_name='br-int',has_traffic_filtering=True,id=3a879a6d-2e72-4536-91fc-3d5cf7c630dd,network=Network(0a543a13-7551-4f54-8112-48a9ec62e1bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a879a6d-2e') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.796 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.797 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.798 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.799 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.800 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '1d223cc5-d6c7-5a37-b974-59df25dab2ef', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.801 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.802 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.806 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.807 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3a879a6d-2e, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.808 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap3a879a6d-2e, col_values=(('qos', UUID('dd92cae3-6d6a-46e1-ae22-1fa92bb16aac')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.808 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap3a879a6d-2e, col_values=(('external_ids', {'iface-id': '3a879a6d-2e72-4536-91fc-3d5cf7c630dd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8e:a5:99', 'vm-uuid': 'c3ef5f5e-bbad-410e-b927-88f69b1cff5b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.810 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:34 compute-0 NetworkManager[56519]: <info>  [1771529434.8113] manager: (tap3a879a6d-2e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.815 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.816 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:34 compute-0 nova_compute[186662]: 2026-02-19 19:30:34.817 186666 INFO os_vif [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:a5:99,bridge_name='br-int',has_traffic_filtering=True,id=3a879a6d-2e72-4536-91fc-3d5cf7c630dd,network=Network(0a543a13-7551-4f54-8112-48a9ec62e1bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a879a6d-2e')
Feb 19 19:30:35 compute-0 podman[210926]: 2026-02-19 19:30:35.270549528 +0000 UTC m=+0.050916866 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1770267347, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9/ubi-minimal, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 19 19:30:36 compute-0 nova_compute[186662]: 2026-02-19 19:30:36.349 186666 DEBUG nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:30:36 compute-0 nova_compute[186662]: 2026-02-19 19:30:36.349 186666 DEBUG nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:30:36 compute-0 nova_compute[186662]: 2026-02-19 19:30:36.349 186666 DEBUG nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] No VIF found with MAC fa:16:3e:8e:a5:99, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Feb 19 19:30:36 compute-0 nova_compute[186662]: 2026-02-19 19:30:36.350 186666 INFO nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Using config drive
Feb 19 19:30:36 compute-0 nova_compute[186662]: 2026-02-19 19:30:36.860 186666 WARNING neutronclient.v2_0.client [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.046 186666 INFO nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Creating config drive at /var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk.config
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.050 186666 DEBUG oslo_concurrency.processutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp8ef5g5o7 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.172 186666 DEBUG oslo_concurrency.processutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp8ef5g5o7" returned: 0 in 0.122s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:30:37 compute-0 kernel: tap3a879a6d-2e: entered promiscuous mode
Feb 19 19:30:37 compute-0 NetworkManager[56519]: <info>  [1771529437.2285] manager: (tap3a879a6d-2e): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Feb 19 19:30:37 compute-0 ovn_controller[96653]: 2026-02-19T19:30:37Z|00095|binding|INFO|Claiming lport 3a879a6d-2e72-4536-91fc-3d5cf7c630dd for this chassis.
Feb 19 19:30:37 compute-0 ovn_controller[96653]: 2026-02-19T19:30:37Z|00096|binding|INFO|3a879a6d-2e72-4536-91fc-3d5cf7c630dd: Claiming fa:16:3e:8e:a5:99 10.100.0.11
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.229 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.248 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:a5:99 10.100.0.11'], port_security=['fa:16:3e:8e:a5:99 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c3ef5f5e-bbad-410e-b927-88f69b1cff5b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a543a13-7551-4f54-8112-48a9ec62e1bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4336eca41984cf8bd35f5039db90e19', 'neutron:revision_number': '4', 'neutron:security_group_ids': '08ac8fcf-f732-4a9b-bd03-253fb11db4ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6837b7ea-7979-471d-8324-067c4fb4f754, chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=3a879a6d-2e72-4536-91fc-3d5cf7c630dd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.249 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 3a879a6d-2e72-4536-91fc-3d5cf7c630dd in datapath 0a543a13-7551-4f54-8112-48a9ec62e1bb bound to our chassis
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.249 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0a543a13-7551-4f54-8112-48a9ec62e1bb
Feb 19 19:30:37 compute-0 systemd-machined[156014]: New machine qemu-8-instance-0000000b.
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.253 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:37 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-0000000b.
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.261 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[495dcd6c-2ccc-4011-951a-594ae3daf2fe]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.261 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0a543a13-71 in ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.263 207540 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0a543a13-70 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.263 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[b58ee7fc-2a66-45b9-9e84-27d0c818170d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:30:37 compute-0 ovn_controller[96653]: 2026-02-19T19:30:37Z|00097|binding|INFO|Setting lport 3a879a6d-2e72-4536-91fc-3d5cf7c630dd ovn-installed in OVS
Feb 19 19:30:37 compute-0 ovn_controller[96653]: 2026-02-19T19:30:37Z|00098|binding|INFO|Setting lport 3a879a6d-2e72-4536-91fc-3d5cf7c630dd up in Southbound
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.264 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[149f03a5-6de5-4733-91da-12f044204d91]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.265 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:37 compute-0 systemd-udevd[210976]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.270 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[adcd1e91-5257-4792-a658-8c6e44979c57]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:30:37 compute-0 NetworkManager[56519]: <info>  [1771529437.2806] device (tap3a879a6d-2e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:30:37 compute-0 NetworkManager[56519]: <info>  [1771529437.2812] device (tap3a879a6d-2e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.292 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[be33232b-55aa-448b-9530-88db93a3a9fc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.317 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[fad63b1c-690e-4b15-85de-422b8f5ba3f1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.320 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[92325855-1a34-4cb5-8157-fd673573d942]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:30:37 compute-0 systemd-udevd[210986]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:30:37 compute-0 NetworkManager[56519]: <info>  [1771529437.3212] manager: (tap0a543a13-70): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Feb 19 19:30:37 compute-0 podman[210956]: 2026-02-19 19:30:37.328456844 +0000 UTC m=+0.098508010 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.342 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[3f0f1ca7-0633-4856-93f1-66c2c6b435b4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.345 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[0c2db38d-1ae9-4ed2-aa3a-047de39ece84]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:30:37 compute-0 NetworkManager[56519]: <info>  [1771529437.3585] device (tap0a543a13-70): carrier: link connected
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.362 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[7dcebafd-942a-4c1e-9f21-308bedaba439]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.369 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.374 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[5f49d7b5-e45d-486f-9c8b-b0eb0e43c25b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a543a13-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:3a:9b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 370692, 'reachable_time': 22290, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211022, 'error': None, 'target': 'ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.384 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[acd7c3b8-8ba3-48f3-93d2-975fd5efd3a5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:3a9b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 370692, 'tstamp': 370692}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211023, 'error': None, 'target': 'ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.395 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[949919b1-e57b-40c6-9d11-e33db19f02b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a543a13-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:3a:9b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 370692, 'reachable_time': 22290, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211024, 'error': None, 'target': 'ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.415 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[1ae9d966-b0fe-460e-af96-5fd58f1060f8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.456 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[03c9f636-21d7-41da-9f10-cebf85eb9eb0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.457 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a543a13-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.457 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.457 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a543a13-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.459 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:37 compute-0 NetworkManager[56519]: <info>  [1771529437.4598] manager: (tap0a543a13-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Feb 19 19:30:37 compute-0 kernel: tap0a543a13-70: entered promiscuous mode
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.461 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.462 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0a543a13-70, col_values=(('external_ids', {'iface-id': '7240d498-9bfd-45c4-907f-9700c860bcaf'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.463 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:37 compute-0 ovn_controller[96653]: 2026-02-19T19:30:37Z|00099|binding|INFO|Releasing lport 7240d498-9bfd-45c4-907f-9700c860bcaf from this chassis (sb_readonly=0)
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.464 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.464 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[08c4391f-9159-4b86-bfbc-413f28a40692]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.465 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a543a13-7551-4f54-8112-48a9ec62e1bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a543a13-7551-4f54-8112-48a9ec62e1bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.465 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a543a13-7551-4f54-8112-48a9ec62e1bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a543a13-7551-4f54-8112-48a9ec62e1bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.465 105986 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 0a543a13-7551-4f54-8112-48a9ec62e1bb disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.465 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a543a13-7551-4f54-8112-48a9ec62e1bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a543a13-7551-4f54-8112-48a9ec62e1bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.466 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[8eef240b-e564-4e42-9101-8c5f8aeaf905]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.466 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a543a13-7551-4f54-8112-48a9ec62e1bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a543a13-7551-4f54-8112-48a9ec62e1bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.466 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[cae813b9-50f0-4737-b125-1f2b1954237f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.466 105986 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: global
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     log         /dev/log local0 debug
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     log-tag     haproxy-metadata-proxy-0a543a13-7551-4f54-8112-48a9ec62e1bb
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     user        root
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     group       root
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     maxconn     1024
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     pidfile     /var/lib/neutron/external/pids/0a543a13-7551-4f54-8112-48a9ec62e1bb.pid.haproxy
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     daemon
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: defaults
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     log global
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     mode http
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     option httplog
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     option dontlognull
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     option http-server-close
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     option forwardfor
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     retries                 3
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     timeout http-request    30s
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     timeout connect         30s
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     timeout client          32s
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     timeout server          32s
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     timeout http-keep-alive 30s
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: listen listener
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     bind 169.254.169.254:80
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     server metadata /var/lib/neutron/metadata_proxy
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:     http-request add-header X-OVN-Network-ID 0a543a13-7551-4f54-8112-48a9ec62e1bb
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.467 105986 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb', 'env', 'PROCESS_TAG=haproxy-0a543a13-7551-4f54-8112-48a9ec62e1bb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0a543a13-7551-4f54-8112-48a9ec62e1bb.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.467 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.477 186666 DEBUG nova.compute.manager [req-5ec7aa95-5a3c-4723-8bce-b8ee185b2287 req-383404ec-b892-4e4d-85d7-b779fff4dbfd 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Received event network-vif-plugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.477 186666 DEBUG oslo_concurrency.lockutils [req-5ec7aa95-5a3c-4723-8bce-b8ee185b2287 req-383404ec-b892-4e4d-85d7-b779fff4dbfd 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.477 186666 DEBUG oslo_concurrency.lockutils [req-5ec7aa95-5a3c-4723-8bce-b8ee185b2287 req-383404ec-b892-4e4d-85d7-b779fff4dbfd 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.478 186666 DEBUG oslo_concurrency.lockutils [req-5ec7aa95-5a3c-4723-8bce-b8ee185b2287 req-383404ec-b892-4e4d-85d7-b779fff4dbfd 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.478 186666 DEBUG nova.compute.manager [req-5ec7aa95-5a3c-4723-8bce-b8ee185b2287 req-383404ec-b892-4e4d-85d7-b779fff4dbfd 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Processing event network-vif-plugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.571 186666 DEBUG nova.compute.manager [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.574 186666 DEBUG nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.577 186666 INFO nova.virt.libvirt.driver [-] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Instance spawned successfully.
Feb 19 19:30:37 compute-0 nova_compute[186662]: 2026-02-19 19:30:37.578 186666 DEBUG nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Feb 19 19:30:37 compute-0 podman[211063]: 2026-02-19 19:30:37.881268501 +0000 UTC m=+0.106700358 container create 7c1824098211f7ec6db2805b732d6e2e808d4510a53649770ee73e568bf8f5eb (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260216)
Feb 19 19:30:37 compute-0 podman[211063]: 2026-02-19 19:30:37.797585608 +0000 UTC m=+0.023017485 image pull dfc4ee2c621ea595c16a7cd7a8233293e80df6b99346b1ce1a7ed22a35aace27 38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Feb 19 19:30:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:30:37.910 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:30:37 compute-0 systemd[1]: Started libpod-conmon-7c1824098211f7ec6db2805b732d6e2e808d4510a53649770ee73e568bf8f5eb.scope.
Feb 19 19:30:37 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:30:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9a6c8dcbee83753ca300132a64a44c7ad63def23be2d96c33337d493faab278/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 19 19:30:37 compute-0 podman[211063]: 2026-02-19 19:30:37.978042798 +0000 UTC m=+0.203474695 container init 7c1824098211f7ec6db2805b732d6e2e808d4510a53649770ee73e568bf8f5eb (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260216)
Feb 19 19:30:37 compute-0 podman[211063]: 2026-02-19 19:30:37.987107067 +0000 UTC m=+0.212538924 container start 7c1824098211f7ec6db2805b732d6e2e808d4510a53649770ee73e568bf8f5eb (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 19 19:30:38 compute-0 neutron-haproxy-ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb[211078]: [NOTICE]   (211082) : New worker (211084) forked
Feb 19 19:30:38 compute-0 neutron-haproxy-ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb[211078]: [NOTICE]   (211082) : Loading success.
Feb 19 19:30:38 compute-0 nova_compute[186662]: 2026-02-19 19:30:38.109 186666 DEBUG nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:30:38 compute-0 nova_compute[186662]: 2026-02-19 19:30:38.109 186666 DEBUG nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:30:38 compute-0 nova_compute[186662]: 2026-02-19 19:30:38.109 186666 DEBUG nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:30:38 compute-0 nova_compute[186662]: 2026-02-19 19:30:38.110 186666 DEBUG nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:30:38 compute-0 nova_compute[186662]: 2026-02-19 19:30:38.110 186666 DEBUG nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:30:38 compute-0 nova_compute[186662]: 2026-02-19 19:30:38.111 186666 DEBUG nova.virt.libvirt.driver [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:30:38 compute-0 nova_compute[186662]: 2026-02-19 19:30:38.621 186666 INFO nova.compute.manager [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Took 7.36 seconds to spawn the instance on the hypervisor.
Feb 19 19:30:38 compute-0 nova_compute[186662]: 2026-02-19 19:30:38.622 186666 DEBUG nova.compute.manager [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:30:39 compute-0 nova_compute[186662]: 2026-02-19 19:30:39.152 186666 INFO nova.compute.manager [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Took 12.54 seconds to build instance.
Feb 19 19:30:39 compute-0 nova_compute[186662]: 2026-02-19 19:30:39.542 186666 DEBUG nova.compute.manager [req-b9effca9-9843-49ee-8f3e-f56bc9e4587f req-7629c433-6743-4c23-b441-1528306a5f22 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Received event network-vif-plugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:30:39 compute-0 nova_compute[186662]: 2026-02-19 19:30:39.543 186666 DEBUG oslo_concurrency.lockutils [req-b9effca9-9843-49ee-8f3e-f56bc9e4587f req-7629c433-6743-4c23-b441-1528306a5f22 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:30:39 compute-0 nova_compute[186662]: 2026-02-19 19:30:39.543 186666 DEBUG oslo_concurrency.lockutils [req-b9effca9-9843-49ee-8f3e-f56bc9e4587f req-7629c433-6743-4c23-b441-1528306a5f22 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:30:39 compute-0 nova_compute[186662]: 2026-02-19 19:30:39.543 186666 DEBUG oslo_concurrency.lockutils [req-b9effca9-9843-49ee-8f3e-f56bc9e4587f req-7629c433-6743-4c23-b441-1528306a5f22 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:30:39 compute-0 nova_compute[186662]: 2026-02-19 19:30:39.543 186666 DEBUG nova.compute.manager [req-b9effca9-9843-49ee-8f3e-f56bc9e4587f req-7629c433-6743-4c23-b441-1528306a5f22 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] No waiting events found dispatching network-vif-plugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:30:39 compute-0 nova_compute[186662]: 2026-02-19 19:30:39.544 186666 WARNING nova.compute.manager [req-b9effca9-9843-49ee-8f3e-f56bc9e4587f req-7629c433-6743-4c23-b441-1528306a5f22 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Received unexpected event network-vif-plugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd for instance with vm_state active and task_state None.
Feb 19 19:30:39 compute-0 nova_compute[186662]: 2026-02-19 19:30:39.659 186666 DEBUG oslo_concurrency.lockutils [None req-7935df6e-2202-4d0f-9886-36fbd3b1001d 856c67cb702b4cd7a27eb3f75ca31608 c4336eca41984cf8bd35f5039db90e19 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.063s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:30:39 compute-0 nova_compute[186662]: 2026-02-19 19:30:39.812 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:42 compute-0 nova_compute[186662]: 2026-02-19 19:30:42.371 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:44 compute-0 nova_compute[186662]: 2026-02-19 19:30:44.815 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:46 compute-0 podman[211093]: 2026-02-19 19:30:46.296595066 +0000 UTC m=+0.059608244 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:30:47 compute-0 nova_compute[186662]: 2026-02-19 19:30:47.374 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:48 compute-0 ovn_controller[96653]: 2026-02-19T19:30:48Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8e:a5:99 10.100.0.11
Feb 19 19:30:48 compute-0 ovn_controller[96653]: 2026-02-19T19:30:48Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8e:a5:99 10.100.0.11
Feb 19 19:30:49 compute-0 nova_compute[186662]: 2026-02-19 19:30:49.854 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:52 compute-0 nova_compute[186662]: 2026-02-19 19:30:52.376 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:54 compute-0 nova_compute[186662]: 2026-02-19 19:30:54.894 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:57 compute-0 nova_compute[186662]: 2026-02-19 19:30:57.595 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:30:58 compute-0 nova_compute[186662]: 2026-02-19 19:30:58.431 186666 DEBUG nova.compute.manager [None req-65087db1-8ca1-453e-862a-01f1f4a04832 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:636
Feb 19 19:30:58 compute-0 nova_compute[186662]: 2026-02-19 19:30:58.490 186666 DEBUG nova.compute.provider_tree [None req-65087db1-8ca1-453e-862a-01f1f4a04832 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Updating resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 generation from 10 to 12 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Feb 19 19:30:59 compute-0 podman[196025]: time="2026-02-19T19:30:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:30:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:30:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17221 "" "Go-http-client/1.1"
Feb 19 19:30:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:30:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2661 "" "Go-http-client/1.1"
Feb 19 19:30:59 compute-0 nova_compute[186662]: 2026-02-19 19:30:59.897 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:01 compute-0 openstack_network_exporter[198916]: ERROR   19:31:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:31:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:31:01 compute-0 openstack_network_exporter[198916]: ERROR   19:31:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:31:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:31:02 compute-0 podman[211137]: 2026-02-19 19:31:02.269430964 +0000 UTC m=+0.049561694 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 19 19:31:02 compute-0 nova_compute[186662]: 2026-02-19 19:31:02.596 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:04 compute-0 nova_compute[186662]: 2026-02-19 19:31:04.901 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:05 compute-0 nova_compute[186662]: 2026-02-19 19:31:05.432 186666 DEBUG nova.virt.libvirt.driver [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Check if temp file /var/lib/nova/instances/tmpmainl8sh exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Feb 19 19:31:05 compute-0 nova_compute[186662]: 2026-02-19 19:31:05.445 186666 DEBUG nova.compute.manager [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmainl8sh',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c3ef5f5e-bbad-410e-b927-88f69b1cff5b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Feb 19 19:31:06 compute-0 podman[211156]: 2026-02-19 19:31:06.324020715 +0000 UTC m=+0.089913954 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vendor=Red Hat, Inc., version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter)
Feb 19 19:31:07 compute-0 ovn_controller[96653]: 2026-02-19T19:31:07Z|00100|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 19 19:31:07 compute-0 nova_compute[186662]: 2026-02-19 19:31:07.600 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:08 compute-0 podman[211177]: 2026-02-19 19:31:08.297094541 +0000 UTC m=+0.075881556 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:31:09 compute-0 nova_compute[186662]: 2026-02-19 19:31:09.751 186666 DEBUG oslo_concurrency.processutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:31:09 compute-0 nova_compute[186662]: 2026-02-19 19:31:09.831 186666 DEBUG oslo_concurrency.processutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:31:09 compute-0 nova_compute[186662]: 2026-02-19 19:31:09.832 186666 DEBUG oslo_concurrency.processutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:31:09 compute-0 nova_compute[186662]: 2026-02-19 19:31:09.890 186666 DEBUG oslo_concurrency.processutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:31:09 compute-0 nova_compute[186662]: 2026-02-19 19:31:09.891 186666 DEBUG nova.compute.manager [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Preparing to wait for external event network-vif-plugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Feb 19 19:31:09 compute-0 nova_compute[186662]: 2026-02-19 19:31:09.892 186666 DEBUG oslo_concurrency.lockutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:31:09 compute-0 nova_compute[186662]: 2026-02-19 19:31:09.892 186666 DEBUG oslo_concurrency.lockutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:31:09 compute-0 nova_compute[186662]: 2026-02-19 19:31:09.893 186666 DEBUG oslo_concurrency.lockutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:31:09 compute-0 nova_compute[186662]: 2026-02-19 19:31:09.904 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:12 compute-0 nova_compute[186662]: 2026-02-19 19:31:12.652 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:12 compute-0 nova_compute[186662]: 2026-02-19 19:31:12.866 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:31:12 compute-0 nova_compute[186662]: 2026-02-19 19:31:12.866 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:31:12 compute-0 nova_compute[186662]: 2026-02-19 19:31:12.867 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:31:12 compute-0 nova_compute[186662]: 2026-02-19 19:31:12.867 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:31:12 compute-0 nova_compute[186662]: 2026-02-19 19:31:12.867 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:31:14 compute-0 nova_compute[186662]: 2026-02-19 19:31:14.908 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:15 compute-0 nova_compute[186662]: 2026-02-19 19:31:15.572 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:31:16 compute-0 nova_compute[186662]: 2026-02-19 19:31:16.087 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:31:16 compute-0 nova_compute[186662]: 2026-02-19 19:31:16.087 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:31:16 compute-0 nova_compute[186662]: 2026-02-19 19:31:16.611 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:31:16 compute-0 nova_compute[186662]: 2026-02-19 19:31:16.612 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:31:16 compute-0 nova_compute[186662]: 2026-02-19 19:31:16.612 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:31:16 compute-0 nova_compute[186662]: 2026-02-19 19:31:16.612 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:31:16 compute-0 podman[211211]: 2026-02-19 19:31:16.693530402 +0000 UTC m=+0.044357017 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:31:17 compute-0 nova_compute[186662]: 2026-02-19 19:31:17.651 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:31:17 compute-0 nova_compute[186662]: 2026-02-19 19:31:17.655 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:17 compute-0 nova_compute[186662]: 2026-02-19 19:31:17.692 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:31:17 compute-0 nova_compute[186662]: 2026-02-19 19:31:17.692 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:31:17 compute-0 nova_compute[186662]: 2026-02-19 19:31:17.730 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk --force-share --output=json" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:31:17 compute-0 nova_compute[186662]: 2026-02-19 19:31:17.862 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:31:17 compute-0 nova_compute[186662]: 2026-02-19 19:31:17.864 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:31:17 compute-0 nova_compute[186662]: 2026-02-19 19:31:17.881 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:31:17 compute-0 nova_compute[186662]: 2026-02-19 19:31:17.882 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5676MB free_disk=72.94763565063477GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:31:17 compute-0 nova_compute[186662]: 2026-02-19 19:31:17.882 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:31:17 compute-0 nova_compute[186662]: 2026-02-19 19:31:17.882 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:31:17 compute-0 nova_compute[186662]: 2026-02-19 19:31:17.931 186666 DEBUG nova.compute.manager [req-02315cdb-c7f8-4d7d-be1c-fe4c59e8cbf1 req-0c684b00-9ae4-424e-a0db-a6427497fa3b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Received event network-vif-unplugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:31:17 compute-0 nova_compute[186662]: 2026-02-19 19:31:17.932 186666 DEBUG oslo_concurrency.lockutils [req-02315cdb-c7f8-4d7d-be1c-fe4c59e8cbf1 req-0c684b00-9ae4-424e-a0db-a6427497fa3b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:31:17 compute-0 nova_compute[186662]: 2026-02-19 19:31:17.932 186666 DEBUG oslo_concurrency.lockutils [req-02315cdb-c7f8-4d7d-be1c-fe4c59e8cbf1 req-0c684b00-9ae4-424e-a0db-a6427497fa3b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:31:17 compute-0 nova_compute[186662]: 2026-02-19 19:31:17.932 186666 DEBUG oslo_concurrency.lockutils [req-02315cdb-c7f8-4d7d-be1c-fe4c59e8cbf1 req-0c684b00-9ae4-424e-a0db-a6427497fa3b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:31:17 compute-0 nova_compute[186662]: 2026-02-19 19:31:17.933 186666 DEBUG nova.compute.manager [req-02315cdb-c7f8-4d7d-be1c-fe4c59e8cbf1 req-0c684b00-9ae4-424e-a0db-a6427497fa3b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] No event matching network-vif-unplugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd in dict_keys([('network-vif-plugged', '3a879a6d-2e72-4536-91fc-3d5cf7c630dd')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Feb 19 19:31:17 compute-0 nova_compute[186662]: 2026-02-19 19:31:17.933 186666 DEBUG nova.compute.manager [req-02315cdb-c7f8-4d7d-be1c-fe4c59e8cbf1 req-0c684b00-9ae4-424e-a0db-a6427497fa3b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Received event network-vif-unplugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:31:18 compute-0 nova_compute[186662]: 2026-02-19 19:31:18.905 186666 INFO nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Updating resource usage from migration e63fda7d-35b6-4fe0-bd0f-5282f7989ffa
Feb 19 19:31:18 compute-0 nova_compute[186662]: 2026-02-19 19:31:18.969 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Migration e63fda7d-35b6-4fe0-bd0f-5282f7989ffa is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Feb 19 19:31:18 compute-0 nova_compute[186662]: 2026-02-19 19:31:18.969 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:31:18 compute-0 nova_compute[186662]: 2026-02-19 19:31:18.970 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:31:17 up  1:02,  0 user,  load average: 0.24, 0.32, 0.36\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_c4336eca41984cf8bd35f5039db90e19': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:31:19 compute-0 nova_compute[186662]: 2026-02-19 19:31:19.085 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:31:19 compute-0 nova_compute[186662]: 2026-02-19 19:31:19.599 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:31:19 compute-0 nova_compute[186662]: 2026-02-19 19:31:19.912 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:20 compute-0 nova_compute[186662]: 2026-02-19 19:31:20.005 186666 DEBUG nova.compute.manager [req-7cd3fbfe-d512-492c-9308-0fb11a73ed03 req-e00c2764-67f3-4baa-875c-3231f2d856fe 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Received event network-vif-plugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:31:20 compute-0 nova_compute[186662]: 2026-02-19 19:31:20.006 186666 DEBUG oslo_concurrency.lockutils [req-7cd3fbfe-d512-492c-9308-0fb11a73ed03 req-e00c2764-67f3-4baa-875c-3231f2d856fe 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:31:20 compute-0 nova_compute[186662]: 2026-02-19 19:31:20.006 186666 DEBUG oslo_concurrency.lockutils [req-7cd3fbfe-d512-492c-9308-0fb11a73ed03 req-e00c2764-67f3-4baa-875c-3231f2d856fe 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:31:20 compute-0 nova_compute[186662]: 2026-02-19 19:31:20.006 186666 DEBUG oslo_concurrency.lockutils [req-7cd3fbfe-d512-492c-9308-0fb11a73ed03 req-e00c2764-67f3-4baa-875c-3231f2d856fe 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:31:20 compute-0 nova_compute[186662]: 2026-02-19 19:31:20.007 186666 DEBUG nova.compute.manager [req-7cd3fbfe-d512-492c-9308-0fb11a73ed03 req-e00c2764-67f3-4baa-875c-3231f2d856fe 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Processing event network-vif-plugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Feb 19 19:31:20 compute-0 nova_compute[186662]: 2026-02-19 19:31:20.007 186666 DEBUG nova.compute.manager [req-7cd3fbfe-d512-492c-9308-0fb11a73ed03 req-e00c2764-67f3-4baa-875c-3231f2d856fe 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Received event network-changed-3a879a6d-2e72-4536-91fc-3d5cf7c630dd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:31:20 compute-0 nova_compute[186662]: 2026-02-19 19:31:20.007 186666 DEBUG nova.compute.manager [req-7cd3fbfe-d512-492c-9308-0fb11a73ed03 req-e00c2764-67f3-4baa-875c-3231f2d856fe 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Refreshing instance network info cache due to event network-changed-3a879a6d-2e72-4536-91fc-3d5cf7c630dd. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Feb 19 19:31:20 compute-0 nova_compute[186662]: 2026-02-19 19:31:20.007 186666 DEBUG oslo_concurrency.lockutils [req-7cd3fbfe-d512-492c-9308-0fb11a73ed03 req-e00c2764-67f3-4baa-875c-3231f2d856fe 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-c3ef5f5e-bbad-410e-b927-88f69b1cff5b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:31:20 compute-0 nova_compute[186662]: 2026-02-19 19:31:20.008 186666 DEBUG oslo_concurrency.lockutils [req-7cd3fbfe-d512-492c-9308-0fb11a73ed03 req-e00c2764-67f3-4baa-875c-3231f2d856fe 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-c3ef5f5e-bbad-410e-b927-88f69b1cff5b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:31:20 compute-0 nova_compute[186662]: 2026-02-19 19:31:20.008 186666 DEBUG nova.network.neutron [req-7cd3fbfe-d512-492c-9308-0fb11a73ed03 req-e00c2764-67f3-4baa-875c-3231f2d856fe 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Refreshing network info cache for port 3a879a6d-2e72-4536-91fc-3d5cf7c630dd _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Feb 19 19:31:20 compute-0 nova_compute[186662]: 2026-02-19 19:31:20.113 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:31:20 compute-0 nova_compute[186662]: 2026-02-19 19:31:20.113 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.231s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:31:20 compute-0 nova_compute[186662]: 2026-02-19 19:31:20.539 186666 WARNING neutronclient.v2_0.client [req-7cd3fbfe-d512-492c-9308-0fb11a73ed03 req-e00c2764-67f3-4baa-875c-3231f2d856fe 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:31:20 compute-0 nova_compute[186662]: 2026-02-19 19:31:20.601 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:31:20 compute-0 nova_compute[186662]: 2026-02-19 19:31:20.601 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:31:20 compute-0 nova_compute[186662]: 2026-02-19 19:31:20.922 186666 INFO nova.compute.manager [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Took 11.03 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 19 19:31:20 compute-0 nova_compute[186662]: 2026-02-19 19:31:20.923 186666 DEBUG nova.compute.manager [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Feb 19 19:31:21 compute-0 nova_compute[186662]: 2026-02-19 19:31:21.116 186666 WARNING neutronclient.v2_0.client [req-7cd3fbfe-d512-492c-9308-0fb11a73ed03 req-e00c2764-67f3-4baa-875c-3231f2d856fe 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:31:21 compute-0 nova_compute[186662]: 2026-02-19 19:31:21.267 186666 DEBUG nova.network.neutron [req-7cd3fbfe-d512-492c-9308-0fb11a73ed03 req-e00c2764-67f3-4baa-875c-3231f2d856fe 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Updated VIF entry in instance network info cache for port 3a879a6d-2e72-4536-91fc-3d5cf7c630dd. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Feb 19 19:31:21 compute-0 nova_compute[186662]: 2026-02-19 19:31:21.267 186666 DEBUG nova.network.neutron [req-7cd3fbfe-d512-492c-9308-0fb11a73ed03 req-e00c2764-67f3-4baa-875c-3231f2d856fe 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Updating instance_info_cache with network_info: [{"id": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "address": "fa:16:3e:8e:a5:99", "network": {"id": "0a543a13-7551-4f54-8112-48a9ec62e1bb", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1720256855-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0db250c0dbdb430dab926e3c448a8a79", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a879a6d-2e", "ovs_interfaceid": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:31:21 compute-0 nova_compute[186662]: 2026-02-19 19:31:21.434 186666 DEBUG nova.compute.manager [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmainl8sh',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c3ef5f5e-bbad-410e-b927-88f69b1cff5b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(e63fda7d-35b6-4fe0-bd0f-5282f7989ffa),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Feb 19 19:31:22 compute-0 nova_compute[186662]: 2026-02-19 19:31:22.075 186666 DEBUG oslo_concurrency.lockutils [req-7cd3fbfe-d512-492c-9308-0fb11a73ed03 req-e00c2764-67f3-4baa-875c-3231f2d856fe 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-c3ef5f5e-bbad-410e-b927-88f69b1cff5b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:31:22 compute-0 nova_compute[186662]: 2026-02-19 19:31:22.082 186666 DEBUG nova.objects.instance [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'migration_context' on Instance uuid c3ef5f5e-bbad-410e-b927-88f69b1cff5b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:31:22 compute-0 nova_compute[186662]: 2026-02-19 19:31:22.083 186666 DEBUG nova.virt.libvirt.driver [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Feb 19 19:31:22 compute-0 nova_compute[186662]: 2026-02-19 19:31:22.085 186666 DEBUG nova.virt.libvirt.driver [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Feb 19 19:31:22 compute-0 nova_compute[186662]: 2026-02-19 19:31:22.085 186666 DEBUG nova.virt.libvirt.driver [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Feb 19 19:31:22 compute-0 nova_compute[186662]: 2026-02-19 19:31:22.588 186666 DEBUG nova.virt.libvirt.driver [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Feb 19 19:31:22 compute-0 nova_compute[186662]: 2026-02-19 19:31:22.588 186666 DEBUG nova.virt.libvirt.driver [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Feb 19 19:31:22 compute-0 nova_compute[186662]: 2026-02-19 19:31:22.597 186666 DEBUG nova.virt.libvirt.vif [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:30:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-2040083590',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-2040083590',id=11,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:30:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c4336eca41984cf8bd35f5039db90e19',ramdisk_id='',reservation_id='r-wobtfndh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-717047797',owner_user_name='tempest-TestExecuteBasicStrategy-717047797-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:30:38Z,user_data=None,user_id='856c67cb702b4cd7a27eb3f75ca31608',uuid=c3ef5f5e-bbad-410e-b927-88f69b1cff5b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "address": "fa:16:3e:8e:a5:99", "network": {"id": "0a543a13-7551-4f54-8112-48a9ec62e1bb", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1720256855-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0db250c0dbdb430dab926e3c448a8a79", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3a879a6d-2e", "ovs_interfaceid": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Feb 19 19:31:22 compute-0 nova_compute[186662]: 2026-02-19 19:31:22.598 186666 DEBUG nova.network.os_vif_util [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "address": "fa:16:3e:8e:a5:99", "network": {"id": "0a543a13-7551-4f54-8112-48a9ec62e1bb", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1720256855-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0db250c0dbdb430dab926e3c448a8a79", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3a879a6d-2e", "ovs_interfaceid": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:31:22 compute-0 nova_compute[186662]: 2026-02-19 19:31:22.599 186666 DEBUG nova.network.os_vif_util [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:a5:99,bridge_name='br-int',has_traffic_filtering=True,id=3a879a6d-2e72-4536-91fc-3d5cf7c630dd,network=Network(0a543a13-7551-4f54-8112-48a9ec62e1bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a879a6d-2e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:31:22 compute-0 nova_compute[186662]: 2026-02-19 19:31:22.600 186666 DEBUG nova.virt.libvirt.migration [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Updating guest XML with vif config: <interface type="ethernet">
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <mac address="fa:16:3e:8e:a5:99"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <model type="virtio"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <mtu size="1442"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <target dev="tap3a879a6d-2e"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]: </interface>
Feb 19 19:31:22 compute-0 nova_compute[186662]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Feb 19 19:31:22 compute-0 nova_compute[186662]: 2026-02-19 19:31:22.601 186666 DEBUG nova.virt.libvirt.migration [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <name>instance-0000000b</name>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <uuid>c3ef5f5e-bbad-410e-b927-88f69b1cff5b</uuid>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteBasicStrategy-server-2040083590</nova:name>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:30:34</nova:creationTime>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:31:22 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:31:22 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:user uuid="856c67cb702b4cd7a27eb3f75ca31608">tempest-TestExecuteBasicStrategy-717047797-project-admin</nova:user>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:project uuid="c4336eca41984cf8bd35f5039db90e19">tempest-TestExecuteBasicStrategy-717047797</nova:project>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:port uuid="3a879a6d-2e72-4536-91fc-3d5cf7c630dd">
Feb 19 19:31:22 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <memory unit="KiB">131072</memory>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <currentMemory unit="KiB">131072</currentMemory>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <vcpu placement="static">1</vcpu>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <resource>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <partition>/machine</partition>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </resource>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <system>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <entry name="serial">c3ef5f5e-bbad-410e-b927-88f69b1cff5b</entry>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <entry name="uuid">c3ef5f5e-bbad-410e-b927-88f69b1cff5b</entry>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </system>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <os>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </os>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <features>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <vmcoreinfo state="on"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </features>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact" check="partial">
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <model fallback="allow">Nehalem</model>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <on_poweroff>destroy</on_poweroff>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <on_reboot>restart</on_reboot>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <on_crash>destroy</on_crash>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk.config"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <readonly/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="0" model="pcie-root"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="1" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="1" port="0x10"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="2" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="2" port="0x11"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="3" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="3" port="0x12"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="4" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="4" port="0x13"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="5" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="5" port="0x14"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="6" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="6" port="0x15"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="7" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="7" port="0x16"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="8" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="8" port="0x17"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="9" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="9" port="0x18"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="10" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="10" port="0x19"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="11" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="11" port="0x1a"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="12" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="12" port="0x1b"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="13" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="13" port="0x1c"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="14" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="14" port="0x1d"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="15" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="15" port="0x1e"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="16" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="16" port="0x1f"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="17" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="17" port="0x20"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="18" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="18" port="0x21"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="19" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="19" port="0x22"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="20" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="20" port="0x23"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="21" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="21" port="0x24"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="22" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="22" port="0x25"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="23" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="23" port="0x26"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="24" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="24" port="0x27"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="25" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="25" port="0x28"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-pci-bridge"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="usb" index="0" model="piix3-uhci">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="sata" index="0">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <interface type="ethernet">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <mac address="fa:16:3e:8e:a5:99"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <mtu size="1442"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target dev="tap3a879a6d-2e"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <serial type="pty">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/console.log" append="off"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target type="isa-serial" port="0">
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <model name="isa-serial"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       </target>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <console type="pty">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/console.log" append="off"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target type="serial" port="0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </console>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="usb" bus="0" port="1"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </input>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <input type="mouse" bus="ps2"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <listen type="address" address="::"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </graphics>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <video>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model type="virtio" heads="1" primary="yes"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </video>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]: </domain>
Feb 19 19:31:22 compute-0 nova_compute[186662]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Feb 19 19:31:22 compute-0 nova_compute[186662]: 2026-02-19 19:31:22.602 186666 DEBUG nova.virt.libvirt.migration [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <name>instance-0000000b</name>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <uuid>c3ef5f5e-bbad-410e-b927-88f69b1cff5b</uuid>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteBasicStrategy-server-2040083590</nova:name>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:30:34</nova:creationTime>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:31:22 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:31:22 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:user uuid="856c67cb702b4cd7a27eb3f75ca31608">tempest-TestExecuteBasicStrategy-717047797-project-admin</nova:user>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:project uuid="c4336eca41984cf8bd35f5039db90e19">tempest-TestExecuteBasicStrategy-717047797</nova:project>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:port uuid="3a879a6d-2e72-4536-91fc-3d5cf7c630dd">
Feb 19 19:31:22 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <memory unit="KiB">131072</memory>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <currentMemory unit="KiB">131072</currentMemory>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <vcpu placement="static">1</vcpu>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <resource>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <partition>/machine</partition>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </resource>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <system>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <entry name="serial">c3ef5f5e-bbad-410e-b927-88f69b1cff5b</entry>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <entry name="uuid">c3ef5f5e-bbad-410e-b927-88f69b1cff5b</entry>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </system>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <os>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </os>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <features>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <vmcoreinfo state="on"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </features>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact" check="partial">
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <model fallback="allow">Nehalem</model>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <on_poweroff>destroy</on_poweroff>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <on_reboot>restart</on_reboot>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <on_crash>destroy</on_crash>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk.config"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <readonly/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="0" model="pcie-root"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="1" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="1" port="0x10"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="2" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="2" port="0x11"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="3" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="3" port="0x12"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="4" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="4" port="0x13"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="5" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="5" port="0x14"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="6" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="6" port="0x15"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="7" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="7" port="0x16"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="8" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="8" port="0x17"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="9" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="9" port="0x18"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="10" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="10" port="0x19"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="11" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="11" port="0x1a"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="12" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="12" port="0x1b"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="13" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="13" port="0x1c"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="14" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="14" port="0x1d"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="15" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="15" port="0x1e"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="16" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="16" port="0x1f"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="17" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="17" port="0x20"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="18" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="18" port="0x21"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="19" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="19" port="0x22"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="20" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="20" port="0x23"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="21" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="21" port="0x24"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="22" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="22" port="0x25"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="23" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="23" port="0x26"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="24" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="24" port="0x27"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="25" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="25" port="0x28"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-pci-bridge"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="usb" index="0" model="piix3-uhci">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="sata" index="0">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <interface type="ethernet"><mac address="fa:16:3e:8e:a5:99"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3a879a6d-2e"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </interface><serial type="pty">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/console.log" append="off"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target type="isa-serial" port="0">
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <model name="isa-serial"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       </target>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <console type="pty">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/console.log" append="off"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target type="serial" port="0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </console>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="usb" bus="0" port="1"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </input>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <input type="mouse" bus="ps2"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <listen type="address" address="::"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </graphics>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <video>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model type="virtio" heads="1" primary="yes"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </video>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]: </domain>
Feb 19 19:31:22 compute-0 nova_compute[186662]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Feb 19 19:31:22 compute-0 nova_compute[186662]: 2026-02-19 19:31:22.603 186666 DEBUG nova.virt.libvirt.migration [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] _update_pci_xml output xml=<domain type="kvm">
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <name>instance-0000000b</name>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <uuid>c3ef5f5e-bbad-410e-b927-88f69b1cff5b</uuid>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteBasicStrategy-server-2040083590</nova:name>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:30:34</nova:creationTime>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:31:22 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:31:22 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:user uuid="856c67cb702b4cd7a27eb3f75ca31608">tempest-TestExecuteBasicStrategy-717047797-project-admin</nova:user>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:project uuid="c4336eca41984cf8bd35f5039db90e19">tempest-TestExecuteBasicStrategy-717047797</nova:project>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <nova:port uuid="3a879a6d-2e72-4536-91fc-3d5cf7c630dd">
Feb 19 19:31:22 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <memory unit="KiB">131072</memory>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <currentMemory unit="KiB">131072</currentMemory>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <vcpu placement="static">1</vcpu>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <resource>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <partition>/machine</partition>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </resource>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <system>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <entry name="serial">c3ef5f5e-bbad-410e-b927-88f69b1cff5b</entry>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <entry name="uuid">c3ef5f5e-bbad-410e-b927-88f69b1cff5b</entry>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </system>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <os>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </os>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <features>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <vmcoreinfo state="on"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </features>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact" check="partial">
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <model fallback="allow">Nehalem</model>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <on_poweroff>destroy</on_poweroff>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <on_reboot>restart</on_reboot>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <on_crash>destroy</on_crash>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/disk.config"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <readonly/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="0" model="pcie-root"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="1" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="1" port="0x10"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="2" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="2" port="0x11"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="3" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="3" port="0x12"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="4" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="4" port="0x13"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="5" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="5" port="0x14"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="6" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="6" port="0x15"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="7" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="7" port="0x16"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="8" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="8" port="0x17"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="9" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="9" port="0x18"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="10" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="10" port="0x19"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="11" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="11" port="0x1a"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="12" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="12" port="0x1b"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="13" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="13" port="0x1c"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="14" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="14" port="0x1d"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="15" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="15" port="0x1e"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="16" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="16" port="0x1f"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="17" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="17" port="0x20"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="18" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="18" port="0x21"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="19" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="19" port="0x22"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="20" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="20" port="0x23"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="21" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="21" port="0x24"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="22" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="22" port="0x25"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="23" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="23" port="0x26"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="24" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="24" port="0x27"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="25" model="pcie-root-port">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target chassis="25" port="0x28"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model name="pcie-pci-bridge"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="usb" index="0" model="piix3-uhci">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <controller type="sata" index="0">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <interface type="ethernet"><mac address="fa:16:3e:8e:a5:99"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3a879a6d-2e"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </interface><serial type="pty">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/console.log" append="off"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target type="isa-serial" port="0">
Feb 19 19:31:22 compute-0 nova_compute[186662]:         <model name="isa-serial"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       </target>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <console type="pty">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b/console.log" append="off"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <target type="serial" port="0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </console>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="usb" bus="0" port="1"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </input>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <input type="mouse" bus="ps2"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <listen type="address" address="::"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </graphics>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <video>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <model type="virtio" heads="1" primary="yes"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </video>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:31:22 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:31:22 compute-0 nova_compute[186662]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Feb 19 19:31:22 compute-0 nova_compute[186662]: </domain>
Feb 19 19:31:22 compute-0 nova_compute[186662]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Feb 19 19:31:22 compute-0 nova_compute[186662]: 2026-02-19 19:31:22.604 186666 DEBUG nova.virt.libvirt.driver [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Feb 19 19:31:22 compute-0 nova_compute[186662]: 2026-02-19 19:31:22.655 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:23 compute-0 nova_compute[186662]: 2026-02-19 19:31:23.090 186666 DEBUG nova.virt.libvirt.migration [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Feb 19 19:31:23 compute-0 nova_compute[186662]: 2026-02-19 19:31:23.091 186666 INFO nova.virt.libvirt.migration [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 19 19:31:23 compute-0 kernel: tap3a879a6d-2e (unregistering): left promiscuous mode
Feb 19 19:31:23 compute-0 NetworkManager[56519]: <info>  [1771529483.9987] device (tap3a879a6d-2e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:23.999 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:24 compute-0 ovn_controller[96653]: 2026-02-19T19:31:24Z|00101|binding|INFO|Releasing lport 3a879a6d-2e72-4536-91fc-3d5cf7c630dd from this chassis (sb_readonly=0)
Feb 19 19:31:24 compute-0 ovn_controller[96653]: 2026-02-19T19:31:24Z|00102|binding|INFO|Setting lport 3a879a6d-2e72-4536-91fc-3d5cf7c630dd down in Southbound
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.004 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:24 compute-0 ovn_controller[96653]: 2026-02-19T19:31:24Z|00103|binding|INFO|Removing iface tap3a879a6d-2e ovn-installed in OVS
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.006 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.008 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:24.014 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:a5:99 10.100.0.11'], port_security=['fa:16:3e:8e:a5:99 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'd8481919-b10e-4218-b697-835a5c48ac63'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c3ef5f5e-bbad-410e-b927-88f69b1cff5b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a543a13-7551-4f54-8112-48a9ec62e1bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4336eca41984cf8bd35f5039db90e19', 'neutron:revision_number': '10', 'neutron:security_group_ids': '08ac8fcf-f732-4a9b-bd03-253fb11db4ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6837b7ea-7979-471d-8324-067c4fb4f754, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=3a879a6d-2e72-4536-91fc-3d5cf7c630dd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:31:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:24.015 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 3a879a6d-2e72-4536-91fc-3d5cf7c630dd in datapath 0a543a13-7551-4f54-8112-48a9ec62e1bb unbound from our chassis
Feb 19 19:31:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:24.016 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a543a13-7551-4f54-8112-48a9ec62e1bb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:31:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:24.018 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ab8179e1-b0f2-4def-957a-5a9825b69c80]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:31:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:24.019 105986 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb namespace which is not needed anymore
Feb 19 19:31:24 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Feb 19 19:31:24 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000b.scope: Consumed 13.521s CPU time.
Feb 19 19:31:24 compute-0 systemd-machined[156014]: Machine qemu-8-instance-0000000b terminated.
Feb 19 19:31:24 compute-0 neutron-haproxy-ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb[211078]: [NOTICE]   (211082) : haproxy version is 3.0.5-8e879a5
Feb 19 19:31:24 compute-0 neutron-haproxy-ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb[211078]: [NOTICE]   (211082) : path to executable is /usr/sbin/haproxy
Feb 19 19:31:24 compute-0 neutron-haproxy-ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb[211078]: [WARNING]  (211082) : Exiting Master process...
Feb 19 19:31:24 compute-0 podman[211277]: 2026-02-19 19:31:24.102099762 +0000 UTC m=+0.022043531 container kill 7c1824098211f7ec6db2805b732d6e2e808d4510a53649770ee73e568bf8f5eb (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.43.0)
Feb 19 19:31:24 compute-0 neutron-haproxy-ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb[211078]: [ALERT]    (211082) : Current worker (211084) exited with code 143 (Terminated)
Feb 19 19:31:24 compute-0 neutron-haproxy-ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb[211078]: [WARNING]  (211082) : All workers exited. Exiting... (0)
Feb 19 19:31:24 compute-0 systemd[1]: libpod-7c1824098211f7ec6db2805b732d6e2e808d4510a53649770ee73e568bf8f5eb.scope: Deactivated successfully.
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.110 186666 INFO nova.virt.libvirt.driver [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 19 19:31:24 compute-0 podman[211294]: 2026-02-19 19:31:24.134508611 +0000 UTC m=+0.017287886 container died 7c1824098211f7ec6db2805b732d6e2e808d4510a53649770ee73e568bf8f5eb (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216)
Feb 19 19:31:24 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7c1824098211f7ec6db2805b732d6e2e808d4510a53649770ee73e568bf8f5eb-userdata-shm.mount: Deactivated successfully.
Feb 19 19:31:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-b9a6c8dcbee83753ca300132a64a44c7ad63def23be2d96c33337d493faab278-merged.mount: Deactivated successfully.
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.158 186666 DEBUG nova.compute.manager [req-b2de2a4d-fbee-4a86-b9b7-9f1335c06596 req-fea75e41-0e3b-4393-91e3-2bd0001b0c34 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Received event network-vif-unplugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:31:24 compute-0 podman[211294]: 2026-02-19 19:31:24.159053581 +0000 UTC m=+0.041832846 container cleanup 7c1824098211f7ec6db2805b732d6e2e808d4510a53649770ee73e568bf8f5eb (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.159 186666 DEBUG oslo_concurrency.lockutils [req-b2de2a4d-fbee-4a86-b9b7-9f1335c06596 req-fea75e41-0e3b-4393-91e3-2bd0001b0c34 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.159 186666 DEBUG oslo_concurrency.lockutils [req-b2de2a4d-fbee-4a86-b9b7-9f1335c06596 req-fea75e41-0e3b-4393-91e3-2bd0001b0c34 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.160 186666 DEBUG oslo_concurrency.lockutils [req-b2de2a4d-fbee-4a86-b9b7-9f1335c06596 req-fea75e41-0e3b-4393-91e3-2bd0001b0c34 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.160 186666 DEBUG nova.compute.manager [req-b2de2a4d-fbee-4a86-b9b7-9f1335c06596 req-fea75e41-0e3b-4393-91e3-2bd0001b0c34 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] No waiting events found dispatching network-vif-unplugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.160 186666 DEBUG nova.compute.manager [req-b2de2a4d-fbee-4a86-b9b7-9f1335c06596 req-fea75e41-0e3b-4393-91e3-2bd0001b0c34 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Received event network-vif-unplugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:31:24 compute-0 systemd[1]: libpod-conmon-7c1824098211f7ec6db2805b732d6e2e808d4510a53649770ee73e568bf8f5eb.scope: Deactivated successfully.
Feb 19 19:31:24 compute-0 podman[211293]: 2026-02-19 19:31:24.172113046 +0000 UTC m=+0.054326388 container remove 7c1824098211f7ec6db2805b732d6e2e808d4510a53649770ee73e568bf8f5eb (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, tcib_managed=true)
Feb 19 19:31:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:24.175 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[3bcf312b-17da-4f97-92b8-49f2f4ba5106]: (4, ("Thu Feb 19 07:31:24 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb (7c1824098211f7ec6db2805b732d6e2e808d4510a53649770ee73e568bf8f5eb)\n7c1824098211f7ec6db2805b732d6e2e808d4510a53649770ee73e568bf8f5eb\nThu Feb 19 07:31:24 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb (7c1824098211f7ec6db2805b732d6e2e808d4510a53649770ee73e568bf8f5eb)\n7c1824098211f7ec6db2805b732d6e2e808d4510a53649770ee73e568bf8f5eb\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:31:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:24.176 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[14af677d-3e4f-4e67-b6bc-ec66ecc566cb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:31:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:24.176 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a543a13-7551-4f54-8112-48a9ec62e1bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a543a13-7551-4f54-8112-48a9ec62e1bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:31:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:24.176 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[fd7b5bed-735f-4ba9-930a-d19c51e3231a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:31:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:24.177 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a543a13-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.178 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:24 compute-0 kernel: tap0a543a13-70: left promiscuous mode
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.186 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:24.188 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[d9ea6dc7-0648-4365-966f-7303d2af4486]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:31:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:24.202 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[841c0b93-d1b0-4b30-b925-09dfc7a9d67f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:31:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:24.203 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[604ff6da-0ccc-423a-8692-df0df67a339f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:31:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:24.213 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[3d3e1a24-18f9-433e-a331-653551bec79d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 370687, 'reachable_time': 28110, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211337, 'error': None, 'target': 'ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:31:24 compute-0 systemd[1]: run-netns-ovnmeta\x2d0a543a13\x2d7551\x2d4f54\x2d8112\x2d48a9ec62e1bb.mount: Deactivated successfully.
Feb 19 19:31:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:24.215 106358 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0a543a13-7551-4f54-8112-48a9ec62e1bb deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Feb 19 19:31:24 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:24.215 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[1e43653f-0a8f-4324-a817-560ffb6822b9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.232 186666 DEBUG nova.virt.libvirt.driver [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.233 186666 DEBUG nova.virt.libvirt.driver [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.233 186666 DEBUG nova.virt.libvirt.driver [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.612 186666 DEBUG nova.virt.libvirt.guest [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'c3ef5f5e-bbad-410e-b927-88f69b1cff5b' (instance-0000000b) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.613 186666 INFO nova.virt.libvirt.driver [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Migration operation has completed
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.613 186666 INFO nova.compute.manager [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] _post_live_migration() is started..
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.862 186666 WARNING neutronclient.v2_0.client [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.862 186666 WARNING neutronclient.v2_0.client [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:31:24 compute-0 nova_compute[186662]: 2026-02-19 19:31:24.934 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.693 186666 DEBUG nova.network.neutron [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Activated binding for port 3a879a6d-2e72-4536-91fc-3d5cf7c630dd and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.694 186666 DEBUG nova.compute.manager [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "address": "fa:16:3e:8e:a5:99", "network": {"id": "0a543a13-7551-4f54-8112-48a9ec62e1bb", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1720256855-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0db250c0dbdb430dab926e3c448a8a79", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a879a6d-2e", "ovs_interfaceid": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.694 186666 DEBUG nova.virt.libvirt.vif [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:30:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-2040083590',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-2040083590',id=11,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:30:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c4336eca41984cf8bd35f5039db90e19',ramdisk_id='',reservation_id='r-wobtfndh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',imag
e_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-717047797',owner_user_name='tempest-TestExecuteBasicStrategy-717047797-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:31:00Z,user_data=None,user_id='856c67cb702b4cd7a27eb3f75ca31608',uuid=c3ef5f5e-bbad-410e-b927-88f69b1cff5b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "address": "fa:16:3e:8e:a5:99", "network": {"id": "0a543a13-7551-4f54-8112-48a9ec62e1bb", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1720256855-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0db250c0dbdb430dab926e3c448a8a79", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a879a6d-2e", "ovs_interfaceid": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.694 186666 DEBUG nova.network.os_vif_util [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "address": "fa:16:3e:8e:a5:99", "network": {"id": "0a543a13-7551-4f54-8112-48a9ec62e1bb", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1720256855-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0db250c0dbdb430dab926e3c448a8a79", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a879a6d-2e", "ovs_interfaceid": "3a879a6d-2e72-4536-91fc-3d5cf7c630dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.695 186666 DEBUG nova.network.os_vif_util [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:a5:99,bridge_name='br-int',has_traffic_filtering=True,id=3a879a6d-2e72-4536-91fc-3d5cf7c630dd,network=Network(0a543a13-7551-4f54-8112-48a9ec62e1bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a879a6d-2e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.695 186666 DEBUG os_vif [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:a5:99,bridge_name='br-int',has_traffic_filtering=True,id=3a879a6d-2e72-4536-91fc-3d5cf7c630dd,network=Network(0a543a13-7551-4f54-8112-48a9ec62e1bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a879a6d-2e') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.696 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.697 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a879a6d-2e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.698 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.700 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.701 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.702 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.702 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=dd92cae3-6d6a-46e1-ae22-1fa92bb16aac) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.703 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.705 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.715 186666 INFO os_vif [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:a5:99,bridge_name='br-int',has_traffic_filtering=True,id=3a879a6d-2e72-4536-91fc-3d5cf7c630dd,network=Network(0a543a13-7551-4f54-8112-48a9ec62e1bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a879a6d-2e')
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.716 186666 DEBUG oslo_concurrency.lockutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.716 186666 DEBUG oslo_concurrency.lockutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.716 186666 DEBUG oslo_concurrency.lockutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.717 186666 DEBUG nova.compute.manager [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.717 186666 INFO nova.virt.libvirt.driver [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Deleting instance files /var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b_del
Feb 19 19:31:25 compute-0 nova_compute[186662]: 2026-02-19 19:31:25.718 186666 INFO nova.virt.libvirt.driver [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Deletion of /var/lib/nova/instances/c3ef5f5e-bbad-410e-b927-88f69b1cff5b_del complete
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.219 186666 DEBUG nova.compute.manager [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Received event network-vif-plugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.219 186666 DEBUG oslo_concurrency.lockutils [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.219 186666 DEBUG oslo_concurrency.lockutils [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.219 186666 DEBUG oslo_concurrency.lockutils [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.219 186666 DEBUG nova.compute.manager [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] No waiting events found dispatching network-vif-plugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.219 186666 WARNING nova.compute.manager [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Received unexpected event network-vif-plugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd for instance with vm_state active and task_state migrating.
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.220 186666 DEBUG nova.compute.manager [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Received event network-vif-unplugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.220 186666 DEBUG oslo_concurrency.lockutils [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.220 186666 DEBUG oslo_concurrency.lockutils [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.220 186666 DEBUG oslo_concurrency.lockutils [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.220 186666 DEBUG nova.compute.manager [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] No waiting events found dispatching network-vif-unplugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.220 186666 DEBUG nova.compute.manager [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Received event network-vif-unplugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.220 186666 DEBUG nova.compute.manager [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Received event network-vif-unplugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.220 186666 DEBUG oslo_concurrency.lockutils [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.221 186666 DEBUG oslo_concurrency.lockutils [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.221 186666 DEBUG oslo_concurrency.lockutils [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.221 186666 DEBUG nova.compute.manager [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] No waiting events found dispatching network-vif-unplugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.221 186666 DEBUG nova.compute.manager [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Received event network-vif-unplugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.221 186666 DEBUG nova.compute.manager [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Received event network-vif-plugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.221 186666 DEBUG oslo_concurrency.lockutils [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.221 186666 DEBUG oslo_concurrency.lockutils [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.221 186666 DEBUG oslo_concurrency.lockutils [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.222 186666 DEBUG nova.compute.manager [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] No waiting events found dispatching network-vif-plugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.222 186666 WARNING nova.compute.manager [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Received unexpected event network-vif-plugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd for instance with vm_state active and task_state migrating.
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.222 186666 DEBUG nova.compute.manager [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Received event network-vif-plugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.222 186666 DEBUG oslo_concurrency.lockutils [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.222 186666 DEBUG oslo_concurrency.lockutils [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.222 186666 DEBUG oslo_concurrency.lockutils [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.222 186666 DEBUG nova.compute.manager [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] No waiting events found dispatching network-vif-plugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:31:26 compute-0 nova_compute[186662]: 2026-02-19 19:31:26.222 186666 WARNING nova.compute.manager [req-d56819a6-9f00-49aa-9f92-a7c51c2e0cc5 req-0b88a5f2-ea98-46bf-a38b-46cbd5e75fb3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Received unexpected event network-vif-plugged-3a879a6d-2e72-4536-91fc-3d5cf7c630dd for instance with vm_state active and task_state migrating.
Feb 19 19:31:27 compute-0 nova_compute[186662]: 2026-02-19 19:31:27.657 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:29 compute-0 podman[196025]: time="2026-02-19T19:31:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:31:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:31:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:31:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:31:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2195 "" "Go-http-client/1.1"
Feb 19 19:31:30 compute-0 nova_compute[186662]: 2026-02-19 19:31:30.703 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:31 compute-0 openstack_network_exporter[198916]: ERROR   19:31:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:31:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:31:31 compute-0 openstack_network_exporter[198916]: ERROR   19:31:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:31:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:31:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:32.128 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:31:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:32.128 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:31:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:32.128 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:31:32 compute-0 nova_compute[186662]: 2026-02-19 19:31:32.659 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:33 compute-0 podman[211346]: 2026-02-19 19:31:33.26217735 +0000 UTC m=+0.043531368 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 19 19:31:35 compute-0 nova_compute[186662]: 2026-02-19 19:31:35.704 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:35 compute-0 nova_compute[186662]: 2026-02-19 19:31:35.749 186666 DEBUG oslo_concurrency.lockutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:31:35 compute-0 nova_compute[186662]: 2026-02-19 19:31:35.750 186666 DEBUG oslo_concurrency.lockutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:31:35 compute-0 nova_compute[186662]: 2026-02-19 19:31:35.750 186666 DEBUG oslo_concurrency.lockutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "c3ef5f5e-bbad-410e-b927-88f69b1cff5b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:31:36 compute-0 nova_compute[186662]: 2026-02-19 19:31:36.263 186666 DEBUG oslo_concurrency.lockutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:31:36 compute-0 nova_compute[186662]: 2026-02-19 19:31:36.263 186666 DEBUG oslo_concurrency.lockutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:31:36 compute-0 nova_compute[186662]: 2026-02-19 19:31:36.263 186666 DEBUG oslo_concurrency.lockutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:31:36 compute-0 nova_compute[186662]: 2026-02-19 19:31:36.264 186666 DEBUG nova.compute.resource_tracker [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:31:36 compute-0 nova_compute[186662]: 2026-02-19 19:31:36.398 186666 WARNING nova.virt.libvirt.driver [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:31:36 compute-0 nova_compute[186662]: 2026-02-19 19:31:36.399 186666 DEBUG oslo_concurrency.processutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:31:36 compute-0 nova_compute[186662]: 2026-02-19 19:31:36.414 186666 DEBUG oslo_concurrency.processutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:31:36 compute-0 nova_compute[186662]: 2026-02-19 19:31:36.414 186666 DEBUG nova.compute.resource_tracker [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5827MB free_disk=72.97673416137695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:31:36 compute-0 nova_compute[186662]: 2026-02-19 19:31:36.415 186666 DEBUG oslo_concurrency.lockutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:31:36 compute-0 nova_compute[186662]: 2026-02-19 19:31:36.415 186666 DEBUG oslo_concurrency.lockutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:31:37 compute-0 podman[211367]: 2026-02-19 19:31:37.314606059 +0000 UTC m=+0.090041217 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 19 19:31:37 compute-0 nova_compute[186662]: 2026-02-19 19:31:37.430 186666 DEBUG nova.compute.resource_tracker [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Migration for instance c3ef5f5e-bbad-410e-b927-88f69b1cff5b refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Feb 19 19:31:37 compute-0 nova_compute[186662]: 2026-02-19 19:31:37.710 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:37 compute-0 nova_compute[186662]: 2026-02-19 19:31:37.940 186666 DEBUG nova.compute.resource_tracker [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Feb 19 19:31:37 compute-0 nova_compute[186662]: 2026-02-19 19:31:37.964 186666 DEBUG nova.compute.resource_tracker [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Migration e63fda7d-35b6-4fe0-bd0f-5282f7989ffa is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Feb 19 19:31:37 compute-0 nova_compute[186662]: 2026-02-19 19:31:37.965 186666 DEBUG nova.compute.resource_tracker [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:31:37 compute-0 nova_compute[186662]: 2026-02-19 19:31:37.965 186666 DEBUG nova.compute.resource_tracker [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:31:36 up  1:02,  0 user,  load average: 0.17, 0.30, 0.35\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:31:37 compute-0 nova_compute[186662]: 2026-02-19 19:31:37.998 186666 DEBUG nova.compute.provider_tree [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:31:38 compute-0 nova_compute[186662]: 2026-02-19 19:31:38.505 186666 DEBUG nova.scheduler.client.report [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:31:38 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:38.825 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:31:38 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:38.825 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:31:38 compute-0 nova_compute[186662]: 2026-02-19 19:31:38.871 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:39 compute-0 nova_compute[186662]: 2026-02-19 19:31:39.020 186666 DEBUG nova.compute.resource_tracker [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:31:39 compute-0 nova_compute[186662]: 2026-02-19 19:31:39.020 186666 DEBUG oslo_concurrency.lockutils [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.605s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:31:39 compute-0 nova_compute[186662]: 2026-02-19 19:31:39.048 186666 INFO nova.compute.manager [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 19 19:31:39 compute-0 podman[211389]: 2026-02-19 19:31:39.309823598 +0000 UTC m=+0.087566216 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Feb 19 19:31:40 compute-0 nova_compute[186662]: 2026-02-19 19:31:40.133 186666 INFO nova.scheduler.client.report [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Deleted allocation for migration e63fda7d-35b6-4fe0-bd0f-5282f7989ffa
Feb 19 19:31:40 compute-0 nova_compute[186662]: 2026-02-19 19:31:40.133 186666 DEBUG nova.virt.libvirt.driver [None req-625f0f70-a3a7-4d92-beae-cfa5f1279756 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: c3ef5f5e-bbad-410e-b927-88f69b1cff5b] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Feb 19 19:31:40 compute-0 nova_compute[186662]: 2026-02-19 19:31:40.706 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:41 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:31:41.827 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:31:42 compute-0 nova_compute[186662]: 2026-02-19 19:31:42.767 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:45 compute-0 nova_compute[186662]: 2026-02-19 19:31:45.707 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:46 compute-0 sshd-session[211416]: Invalid user n8n from 182.75.216.74 port 15847
Feb 19 19:31:47 compute-0 podman[211418]: 2026-02-19 19:31:47.000249159 +0000 UTC m=+0.048617200 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:31:47 compute-0 sshd-session[211416]: Received disconnect from 182.75.216.74 port 15847:11: Bye Bye [preauth]
Feb 19 19:31:47 compute-0 sshd-session[211416]: Disconnected from invalid user n8n 182.75.216.74 port 15847 [preauth]
Feb 19 19:31:47 compute-0 nova_compute[186662]: 2026-02-19 19:31:47.809 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:50 compute-0 nova_compute[186662]: 2026-02-19 19:31:50.430 186666 DEBUG nova.compute.manager [None req-4abdd177-8ddc-400d-851f-ace5418fbecf b61936f8600641abb9e2d5787407b4b1 084bf37190834c4d9a8f0459d9d05ec7 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:632
Feb 19 19:31:50 compute-0 nova_compute[186662]: 2026-02-19 19:31:50.491 186666 DEBUG nova.compute.provider_tree [None req-4abdd177-8ddc-400d-851f-ace5418fbecf b61936f8600641abb9e2d5787407b4b1 084bf37190834c4d9a8f0459d9d05ec7 - - default default] Updating resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 generation from 12 to 15 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Feb 19 19:31:50 compute-0 nova_compute[186662]: 2026-02-19 19:31:50.708 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:52 compute-0 nova_compute[186662]: 2026-02-19 19:31:52.856 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:53 compute-0 nova_compute[186662]: 2026-02-19 19:31:53.848 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:55 compute-0 nova_compute[186662]: 2026-02-19 19:31:55.710 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:57 compute-0 nova_compute[186662]: 2026-02-19 19:31:57.890 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:31:59 compute-0 podman[196025]: time="2026-02-19T19:31:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:31:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:31:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:31:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:31:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2193 "" "Go-http-client/1.1"
Feb 19 19:32:00 compute-0 nova_compute[186662]: 2026-02-19 19:32:00.712 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:01 compute-0 openstack_network_exporter[198916]: ERROR   19:32:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:32:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:32:01 compute-0 openstack_network_exporter[198916]: ERROR   19:32:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:32:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:32:02 compute-0 nova_compute[186662]: 2026-02-19 19:32:02.892 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:02 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:02.923 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:d3:cd 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-07126e82-e19d-4936-a9e3-2f12cd31560f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-07126e82-e19d-4936-a9e3-2f12cd31560f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2e0ce41b272247b3a9e7e216a6df6e3d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cf1c528-1140-4c8d-8882-275082ae22a2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=56a58813-c597-4bec-a78c-1af2fdf052a6) old=Port_Binding(mac=['fa:16:3e:cf:d3:cd'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-07126e82-e19d-4936-a9e3-2f12cd31560f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-07126e82-e19d-4936-a9e3-2f12cd31560f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2e0ce41b272247b3a9e7e216a6df6e3d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:32:02 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:02.924 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 56a58813-c597-4bec-a78c-1af2fdf052a6 in datapath 07126e82-e19d-4936-a9e3-2f12cd31560f updated
Feb 19 19:32:02 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:02.925 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 07126e82-e19d-4936-a9e3-2f12cd31560f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:32:02 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:02.926 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ae1cafc3-0a40-42c4-800a-bdff7abd1043]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:32:04 compute-0 podman[211444]: 2026-02-19 19:32:04.262281935 +0000 UTC m=+0.040699930 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:32:05 compute-0 nova_compute[186662]: 2026-02-19 19:32:05.713 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:07 compute-0 nova_compute[186662]: 2026-02-19 19:32:07.895 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:08 compute-0 podman[211462]: 2026-02-19 19:32:08.269596028 +0000 UTC m=+0.047076094 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 19 19:32:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:09.875 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:12:0a 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-1f97a95d-3922-41fc-80db-cb33ebe01f35', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f97a95d-3922-41fc-80db-cb33ebe01f35', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8358100e0bd419880b809dda56dca10', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d4b49c8-eab8-41b4-8c84-7b2b9063a230, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e6227cd2-3023-493d-9f80-eadca64bf0b2) old=Port_Binding(mac=['fa:16:3e:5b:12:0a'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-1f97a95d-3922-41fc-80db-cb33ebe01f35', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f97a95d-3922-41fc-80db-cb33ebe01f35', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8358100e0bd419880b809dda56dca10', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:32:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:09.876 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e6227cd2-3023-493d-9f80-eadca64bf0b2 in datapath 1f97a95d-3922-41fc-80db-cb33ebe01f35 updated
Feb 19 19:32:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:09.878 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1f97a95d-3922-41fc-80db-cb33ebe01f35, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:32:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:09.879 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[89d0d21a-c995-41d3-875f-82b8f21b65df]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:32:10 compute-0 podman[211483]: 2026-02-19 19:32:10.284420549 +0000 UTC m=+0.060945257 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.vendor=CentOS, 
tcib_managed=true, io.buildah.version=1.43.0)
Feb 19 19:32:10 compute-0 nova_compute[186662]: 2026-02-19 19:32:10.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:32:10 compute-0 nova_compute[186662]: 2026-02-19 19:32:10.715 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:11 compute-0 nova_compute[186662]: 2026-02-19 19:32:11.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:32:11 compute-0 nova_compute[186662]: 2026-02-19 19:32:11.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:32:11 compute-0 nova_compute[186662]: 2026-02-19 19:32:11.576 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:32:12 compute-0 nova_compute[186662]: 2026-02-19 19:32:12.896 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:14 compute-0 nova_compute[186662]: 2026-02-19 19:32:14.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:32:15 compute-0 nova_compute[186662]: 2026-02-19 19:32:15.716 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:16 compute-0 nova_compute[186662]: 2026-02-19 19:32:16.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:32:16 compute-0 nova_compute[186662]: 2026-02-19 19:32:16.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:32:16 compute-0 nova_compute[186662]: 2026-02-19 19:32:16.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:32:17 compute-0 nova_compute[186662]: 2026-02-19 19:32:17.091 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:32:17 compute-0 nova_compute[186662]: 2026-02-19 19:32:17.091 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:32:17 compute-0 nova_compute[186662]: 2026-02-19 19:32:17.091 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:32:17 compute-0 nova_compute[186662]: 2026-02-19 19:32:17.091 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:32:17 compute-0 nova_compute[186662]: 2026-02-19 19:32:17.208 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:32:17 compute-0 nova_compute[186662]: 2026-02-19 19:32:17.209 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:32:17 compute-0 nova_compute[186662]: 2026-02-19 19:32:17.220 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.011s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:32:17 compute-0 nova_compute[186662]: 2026-02-19 19:32:17.220 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5840MB free_disk=72.97673416137695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:32:17 compute-0 nova_compute[186662]: 2026-02-19 19:32:17.221 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:32:17 compute-0 nova_compute[186662]: 2026-02-19 19:32:17.221 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:32:17 compute-0 podman[211509]: 2026-02-19 19:32:17.281471172 +0000 UTC m=+0.045942906 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:32:17 compute-0 nova_compute[186662]: 2026-02-19 19:32:17.898 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:18 compute-0 nova_compute[186662]: 2026-02-19 19:32:18.270 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:32:18 compute-0 nova_compute[186662]: 2026-02-19 19:32:18.270 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:32:17 up  1:03,  0 user,  load average: 0.08, 0.26, 0.33\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:32:18 compute-0 nova_compute[186662]: 2026-02-19 19:32:18.283 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:32:18 compute-0 nova_compute[186662]: 2026-02-19 19:32:18.790 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:32:19 compute-0 nova_compute[186662]: 2026-02-19 19:32:19.301 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:32:19 compute-0 nova_compute[186662]: 2026-02-19 19:32:19.302 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.081s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:32:20 compute-0 nova_compute[186662]: 2026-02-19 19:32:20.717 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:21 compute-0 nova_compute[186662]: 2026-02-19 19:32:21.298 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:32:22 compute-0 nova_compute[186662]: 2026-02-19 19:32:22.901 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:25 compute-0 nova_compute[186662]: 2026-02-19 19:32:25.718 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:27 compute-0 ovn_controller[96653]: 2026-02-19T19:32:27Z|00104|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 19 19:32:27 compute-0 nova_compute[186662]: 2026-02-19 19:32:27.902 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:29 compute-0 podman[196025]: time="2026-02-19T19:32:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:32:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:32:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:32:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:32:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2198 "" "Go-http-client/1.1"
Feb 19 19:32:30 compute-0 nova_compute[186662]: 2026-02-19 19:32:30.720 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:31 compute-0 openstack_network_exporter[198916]: ERROR   19:32:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:32:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:32:31 compute-0 openstack_network_exporter[198916]: ERROR   19:32:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:32:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:32:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:32.129 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:32:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:32.129 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:32:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:32.129 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:32:32 compute-0 nova_compute[186662]: 2026-02-19 19:32:32.903 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:35 compute-0 podman[211536]: 2026-02-19 19:32:35.307506 +0000 UTC m=+0.081564133 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 19 19:32:35 compute-0 nova_compute[186662]: 2026-02-19 19:32:35.722 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:37 compute-0 nova_compute[186662]: 2026-02-19 19:32:37.904 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:39 compute-0 podman[211556]: 2026-02-19 19:32:39.280156004 +0000 UTC m=+0.056042810 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, architecture=x86_64, build-date=2026-02-05T04:57:10Z, version=9.7, release=1770267347, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 19 19:32:39 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:39.657 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:32:39 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:39.657 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:32:39 compute-0 nova_compute[186662]: 2026-02-19 19:32:39.658 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:40.720 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dc:25:82 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61a8fd888cc1408eaeded54a293416ed', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82a17ccb-74c9-404e-969f-7001a470c860, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e682361b-89b8-4c67-9593-6b6d57e3096a) old=Port_Binding(mac=['fa:16:3e:dc:25:82'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61a8fd888cc1408eaeded54a293416ed', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:32:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:40.721 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e682361b-89b8-4c67-9593-6b6d57e3096a in datapath f37b00da-2392-46ae-ac87-2c54ab8961a2 updated
Feb 19 19:32:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:40.722 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f37b00da-2392-46ae-ac87-2c54ab8961a2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:32:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:40.723 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[9ecfeb6e-ceb3-49db-8746-c2f4a1bbf8b1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:32:40 compute-0 nova_compute[186662]: 2026-02-19 19:32:40.724 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:41 compute-0 podman[211576]: 2026-02-19 19:32:41.312211434 +0000 UTC m=+0.094494793 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Feb 19 19:32:42 compute-0 nova_compute[186662]: 2026-02-19 19:32:42.908 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:45 compute-0 nova_compute[186662]: 2026-02-19 19:32:45.725 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:46.659 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:32:47 compute-0 nova_compute[186662]: 2026-02-19 19:32:47.909 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:48 compute-0 podman[211604]: 2026-02-19 19:32:48.27877526 +0000 UTC m=+0.057874976 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 19 19:32:49 compute-0 sshd-session[211602]: Invalid user claude from 106.51.64.128 port 6120
Feb 19 19:32:49 compute-0 sshd-session[211602]: Received disconnect from 106.51.64.128 port 6120:11: Bye Bye [preauth]
Feb 19 19:32:49 compute-0 sshd-session[211602]: Disconnected from invalid user claude 106.51.64.128 port 6120 [preauth]
Feb 19 19:32:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:50.383 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:c8:f6 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-731154b8-34ea-4aad-a09c-11427905cdc6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-731154b8-34ea-4aad-a09c-11427905cdc6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '87043e904d374d2fbc50a010c14c8987', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b185818-c810-4927-bc74-b440dfe6f7ea, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=02f56733-5977-4725-858e-077f211dda32) old=Port_Binding(mac=['fa:16:3e:53:c8:f6'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-731154b8-34ea-4aad-a09c-11427905cdc6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-731154b8-34ea-4aad-a09c-11427905cdc6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '87043e904d374d2fbc50a010c14c8987', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:32:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:50.384 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 02f56733-5977-4725-858e-077f211dda32 in datapath 731154b8-34ea-4aad-a09c-11427905cdc6 updated
Feb 19 19:32:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:50.385 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 731154b8-34ea-4aad-a09c-11427905cdc6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:32:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:32:50.386 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[14981765-76f8-485a-a6ff-f6e5b8345d72]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:32:50 compute-0 nova_compute[186662]: 2026-02-19 19:32:50.726 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:52 compute-0 nova_compute[186662]: 2026-02-19 19:32:52.911 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:54 compute-0 sshd-session[211628]: Invalid user n8n from 189.165.79.177 port 56126
Feb 19 19:32:54 compute-0 sshd-session[211628]: Received disconnect from 189.165.79.177 port 56126:11: Bye Bye [preauth]
Feb 19 19:32:54 compute-0 sshd-session[211628]: Disconnected from invalid user n8n 189.165.79.177 port 56126 [preauth]
Feb 19 19:32:55 compute-0 nova_compute[186662]: 2026-02-19 19:32:55.728 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:57 compute-0 nova_compute[186662]: 2026-02-19 19:32:57.912 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:32:59 compute-0 podman[196025]: time="2026-02-19T19:32:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:32:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:32:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:32:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:32:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2196 "" "Go-http-client/1.1"
Feb 19 19:33:00 compute-0 nova_compute[186662]: 2026-02-19 19:33:00.730 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:01 compute-0 openstack_network_exporter[198916]: ERROR   19:33:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:33:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:33:01 compute-0 openstack_network_exporter[198916]: ERROR   19:33:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:33:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:33:02 compute-0 nova_compute[186662]: 2026-02-19 19:33:02.915 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:05 compute-0 nova_compute[186662]: 2026-02-19 19:33:05.732 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:06 compute-0 podman[211630]: 2026-02-19 19:33:06.287294525 +0000 UTC m=+0.058486561 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent)
Feb 19 19:33:07 compute-0 nova_compute[186662]: 2026-02-19 19:33:07.917 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:10 compute-0 podman[211651]: 2026-02-19 19:33:10.328536268 +0000 UTC m=+0.095351824 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git)
Feb 19 19:33:10 compute-0 nova_compute[186662]: 2026-02-19 19:33:10.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:33:10 compute-0 nova_compute[186662]: 2026-02-19 19:33:10.733 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:12 compute-0 podman[211673]: 2026-02-19 19:33:12.306664531 +0000 UTC m=+0.077647485 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Feb 19 19:33:12 compute-0 nova_compute[186662]: 2026-02-19 19:33:12.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:33:12 compute-0 nova_compute[186662]: 2026-02-19 19:33:12.944 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:13 compute-0 nova_compute[186662]: 2026-02-19 19:33:13.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:33:13 compute-0 nova_compute[186662]: 2026-02-19 19:33:13.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:33:15 compute-0 nova_compute[186662]: 2026-02-19 19:33:15.735 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:16 compute-0 nova_compute[186662]: 2026-02-19 19:33:16.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:33:17 compute-0 nova_compute[186662]: 2026-02-19 19:33:17.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:33:17 compute-0 nova_compute[186662]: 2026-02-19 19:33:17.945 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:18 compute-0 nova_compute[186662]: 2026-02-19 19:33:18.078 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:33:18 compute-0 nova_compute[186662]: 2026-02-19 19:33:18.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:33:18 compute-0 nova_compute[186662]: 2026-02-19 19:33:18.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:33:19 compute-0 nova_compute[186662]: 2026-02-19 19:33:19.087 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:33:19 compute-0 nova_compute[186662]: 2026-02-19 19:33:19.088 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:33:19 compute-0 nova_compute[186662]: 2026-02-19 19:33:19.088 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:33:19 compute-0 nova_compute[186662]: 2026-02-19 19:33:19.088 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:33:19 compute-0 nova_compute[186662]: 2026-02-19 19:33:19.262 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:33:19 compute-0 nova_compute[186662]: 2026-02-19 19:33:19.264 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:33:19 compute-0 nova_compute[186662]: 2026-02-19 19:33:19.277 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.013s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:33:19 compute-0 nova_compute[186662]: 2026-02-19 19:33:19.278 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5843MB free_disk=72.97671127319336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:33:19 compute-0 nova_compute[186662]: 2026-02-19 19:33:19.279 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:33:19 compute-0 nova_compute[186662]: 2026-02-19 19:33:19.279 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:33:19 compute-0 podman[211700]: 2026-02-19 19:33:19.304783539 +0000 UTC m=+0.080492775 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:33:20 compute-0 nova_compute[186662]: 2026-02-19 19:33:20.331 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:33:20 compute-0 nova_compute[186662]: 2026-02-19 19:33:20.332 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:33:19 up  1:04,  0 user,  load average: 0.29, 0.29, 0.34\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:33:20 compute-0 nova_compute[186662]: 2026-02-19 19:33:20.355 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:33:20 compute-0 nova_compute[186662]: 2026-02-19 19:33:20.736 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:20 compute-0 nova_compute[186662]: 2026-02-19 19:33:20.862 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:33:21 compute-0 nova_compute[186662]: 2026-02-19 19:33:21.193 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:33:21 compute-0 nova_compute[186662]: 2026-02-19 19:33:21.194 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:33:21 compute-0 nova_compute[186662]: 2026-02-19 19:33:21.372 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:33:21 compute-0 nova_compute[186662]: 2026-02-19 19:33:21.372 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.093s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:33:21 compute-0 nova_compute[186662]: 2026-02-19 19:33:21.701 186666 DEBUG nova.compute.manager [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Feb 19 19:33:22 compute-0 nova_compute[186662]: 2026-02-19 19:33:22.240 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:33:22 compute-0 nova_compute[186662]: 2026-02-19 19:33:22.240 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:33:22 compute-0 nova_compute[186662]: 2026-02-19 19:33:22.247 186666 DEBUG nova.virt.hardware [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Feb 19 19:33:22 compute-0 nova_compute[186662]: 2026-02-19 19:33:22.248 186666 INFO nova.compute.claims [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Claim successful on node compute-0.ctlplane.example.com
Feb 19 19:33:22 compute-0 nova_compute[186662]: 2026-02-19 19:33:22.367 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:33:22 compute-0 nova_compute[186662]: 2026-02-19 19:33:22.948 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:23 compute-0 nova_compute[186662]: 2026-02-19 19:33:23.303 186666 DEBUG nova.compute.provider_tree [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:33:23 compute-0 nova_compute[186662]: 2026-02-19 19:33:23.811 186666 DEBUG nova.scheduler.client.report [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:33:24 compute-0 nova_compute[186662]: 2026-02-19 19:33:24.323 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.083s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:33:24 compute-0 nova_compute[186662]: 2026-02-19 19:33:24.324 186666 DEBUG nova.compute.manager [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Feb 19 19:33:24 compute-0 nova_compute[186662]: 2026-02-19 19:33:24.839 186666 DEBUG nova.compute.manager [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Feb 19 19:33:24 compute-0 nova_compute[186662]: 2026-02-19 19:33:24.840 186666 DEBUG nova.network.neutron [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Feb 19 19:33:24 compute-0 nova_compute[186662]: 2026-02-19 19:33:24.842 186666 WARNING neutronclient.v2_0.client [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:33:24 compute-0 nova_compute[186662]: 2026-02-19 19:33:24.843 186666 WARNING neutronclient.v2_0.client [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:33:25 compute-0 nova_compute[186662]: 2026-02-19 19:33:25.354 186666 INFO nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 19 19:33:25 compute-0 nova_compute[186662]: 2026-02-19 19:33:25.681 186666 DEBUG nova.network.neutron [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Successfully created port: 09d9f919-c2ab-4c91-896e-88de5d7337d7 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Feb 19 19:33:25 compute-0 nova_compute[186662]: 2026-02-19 19:33:25.739 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:25 compute-0 nova_compute[186662]: 2026-02-19 19:33:25.864 186666 DEBUG nova.compute.manager [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Feb 19 19:33:26 compute-0 nova_compute[186662]: 2026-02-19 19:33:26.879 186666 DEBUG nova.compute.manager [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Feb 19 19:33:26 compute-0 nova_compute[186662]: 2026-02-19 19:33:26.880 186666 DEBUG nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Feb 19 19:33:26 compute-0 nova_compute[186662]: 2026-02-19 19:33:26.880 186666 INFO nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Creating image(s)
Feb 19 19:33:26 compute-0 nova_compute[186662]: 2026-02-19 19:33:26.881 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "/var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:33:26 compute-0 nova_compute[186662]: 2026-02-19 19:33:26.881 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "/var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:33:26 compute-0 nova_compute[186662]: 2026-02-19 19:33:26.882 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "/var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:33:26 compute-0 nova_compute[186662]: 2026-02-19 19:33:26.882 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:33:26 compute-0 nova_compute[186662]: 2026-02-19 19:33:26.885 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:33:26 compute-0 nova_compute[186662]: 2026-02-19 19:33:26.886 186666 DEBUG oslo_concurrency.processutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:33:26 compute-0 nova_compute[186662]: 2026-02-19 19:33:26.922 186666 DEBUG oslo_concurrency.processutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:33:26 compute-0 nova_compute[186662]: 2026-02-19 19:33:26.923 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:33:26 compute-0 nova_compute[186662]: 2026-02-19 19:33:26.924 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:33:26 compute-0 nova_compute[186662]: 2026-02-19 19:33:26.924 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:33:26 compute-0 nova_compute[186662]: 2026-02-19 19:33:26.928 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:33:26 compute-0 nova_compute[186662]: 2026-02-19 19:33:26.928 186666 DEBUG oslo_concurrency.processutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:33:26 compute-0 nova_compute[186662]: 2026-02-19 19:33:26.970 186666 DEBUG oslo_concurrency.processutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:33:26 compute-0 nova_compute[186662]: 2026-02-19 19:33:26.971 186666 DEBUG oslo_concurrency.processutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:33:26 compute-0 nova_compute[186662]: 2026-02-19 19:33:26.994 186666 DEBUG oslo_concurrency.processutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce/disk 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:33:26 compute-0 nova_compute[186662]: 2026-02-19 19:33:26.995 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.071s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:33:26 compute-0 nova_compute[186662]: 2026-02-19 19:33:26.995 186666 DEBUG oslo_concurrency.processutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:33:27 compute-0 nova_compute[186662]: 2026-02-19 19:33:27.034 186666 DEBUG oslo_concurrency.processutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:33:27 compute-0 nova_compute[186662]: 2026-02-19 19:33:27.035 186666 DEBUG nova.virt.disk.api [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Checking if we can resize image /var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:33:27 compute-0 nova_compute[186662]: 2026-02-19 19:33:27.035 186666 DEBUG oslo_concurrency.processutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:33:27 compute-0 nova_compute[186662]: 2026-02-19 19:33:27.071 186666 DEBUG oslo_concurrency.processutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce/disk --force-share --output=json" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:33:27 compute-0 nova_compute[186662]: 2026-02-19 19:33:27.072 186666 DEBUG nova.virt.disk.api [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Cannot resize image /var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:33:27 compute-0 nova_compute[186662]: 2026-02-19 19:33:27.073 186666 DEBUG nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Feb 19 19:33:27 compute-0 nova_compute[186662]: 2026-02-19 19:33:27.073 186666 DEBUG nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Ensure instance console log exists: /var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Feb 19 19:33:27 compute-0 nova_compute[186662]: 2026-02-19 19:33:27.073 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:33:27 compute-0 nova_compute[186662]: 2026-02-19 19:33:27.074 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:33:27 compute-0 nova_compute[186662]: 2026-02-19 19:33:27.074 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:33:27 compute-0 nova_compute[186662]: 2026-02-19 19:33:27.755 186666 DEBUG nova.network.neutron [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Successfully updated port: 09d9f919-c2ab-4c91-896e-88de5d7337d7 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Feb 19 19:33:27 compute-0 nova_compute[186662]: 2026-02-19 19:33:27.806 186666 DEBUG nova.compute.manager [req-a6555fa9-cf3f-416f-9ae0-08e1e952d98e req-2418431f-282f-4a12-b43a-645e24a1fcd6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Received event network-changed-09d9f919-c2ab-4c91-896e-88de5d7337d7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:33:27 compute-0 nova_compute[186662]: 2026-02-19 19:33:27.806 186666 DEBUG nova.compute.manager [req-a6555fa9-cf3f-416f-9ae0-08e1e952d98e req-2418431f-282f-4a12-b43a-645e24a1fcd6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Refreshing instance network info cache due to event network-changed-09d9f919-c2ab-4c91-896e-88de5d7337d7. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Feb 19 19:33:27 compute-0 nova_compute[186662]: 2026-02-19 19:33:27.806 186666 DEBUG oslo_concurrency.lockutils [req-a6555fa9-cf3f-416f-9ae0-08e1e952d98e req-2418431f-282f-4a12-b43a-645e24a1fcd6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-f844ee8e-6a84-4d80-aa1d-b8f5230eccce" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:33:27 compute-0 nova_compute[186662]: 2026-02-19 19:33:27.807 186666 DEBUG oslo_concurrency.lockutils [req-a6555fa9-cf3f-416f-9ae0-08e1e952d98e req-2418431f-282f-4a12-b43a-645e24a1fcd6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-f844ee8e-6a84-4d80-aa1d-b8f5230eccce" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:33:27 compute-0 nova_compute[186662]: 2026-02-19 19:33:27.807 186666 DEBUG nova.network.neutron [req-a6555fa9-cf3f-416f-9ae0-08e1e952d98e req-2418431f-282f-4a12-b43a-645e24a1fcd6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Refreshing network info cache for port 09d9f919-c2ab-4c91-896e-88de5d7337d7 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Feb 19 19:33:27 compute-0 nova_compute[186662]: 2026-02-19 19:33:27.949 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:28 compute-0 nova_compute[186662]: 2026-02-19 19:33:28.260 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "refresh_cache-f844ee8e-6a84-4d80-aa1d-b8f5230eccce" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:33:28 compute-0 nova_compute[186662]: 2026-02-19 19:33:28.310 186666 WARNING neutronclient.v2_0.client [req-a6555fa9-cf3f-416f-9ae0-08e1e952d98e req-2418431f-282f-4a12-b43a-645e24a1fcd6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:33:28 compute-0 nova_compute[186662]: 2026-02-19 19:33:28.660 186666 DEBUG nova.network.neutron [req-a6555fa9-cf3f-416f-9ae0-08e1e952d98e req-2418431f-282f-4a12-b43a-645e24a1fcd6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:33:28 compute-0 nova_compute[186662]: 2026-02-19 19:33:28.795 186666 DEBUG nova.network.neutron [req-a6555fa9-cf3f-416f-9ae0-08e1e952d98e req-2418431f-282f-4a12-b43a-645e24a1fcd6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:33:29 compute-0 nova_compute[186662]: 2026-02-19 19:33:29.301 186666 DEBUG oslo_concurrency.lockutils [req-a6555fa9-cf3f-416f-9ae0-08e1e952d98e req-2418431f-282f-4a12-b43a-645e24a1fcd6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-f844ee8e-6a84-4d80-aa1d-b8f5230eccce" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:33:29 compute-0 nova_compute[186662]: 2026-02-19 19:33:29.302 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquired lock "refresh_cache-f844ee8e-6a84-4d80-aa1d-b8f5230eccce" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:33:29 compute-0 nova_compute[186662]: 2026-02-19 19:33:29.302 186666 DEBUG nova.network.neutron [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:33:29 compute-0 podman[196025]: time="2026-02-19T19:33:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:33:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:33:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:33:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:33:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2198 "" "Go-http-client/1.1"
Feb 19 19:33:30 compute-0 nova_compute[186662]: 2026-02-19 19:33:30.374 186666 DEBUG nova.network.neutron [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:33:30 compute-0 nova_compute[186662]: 2026-02-19 19:33:30.594 186666 WARNING neutronclient.v2_0.client [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:33:30 compute-0 nova_compute[186662]: 2026-02-19 19:33:30.740 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:30 compute-0 nova_compute[186662]: 2026-02-19 19:33:30.761 186666 DEBUG nova.network.neutron [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Updating instance_info_cache with network_info: [{"id": "09d9f919-c2ab-4c91-896e-88de5d7337d7", "address": "fa:16:3e:ba:28:a5", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09d9f919-c2", "ovs_interfaceid": "09d9f919-c2ab-4c91-896e-88de5d7337d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.271 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Releasing lock "refresh_cache-f844ee8e-6a84-4d80-aa1d-b8f5230eccce" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.271 186666 DEBUG nova.compute.manager [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Instance network_info: |[{"id": "09d9f919-c2ab-4c91-896e-88de5d7337d7", "address": "fa:16:3e:ba:28:a5", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09d9f919-c2", "ovs_interfaceid": "09d9f919-c2ab-4c91-896e-88de5d7337d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.275 186666 DEBUG nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Start _get_guest_xml network_info=[{"id": "09d9f919-c2ab-4c91-896e-88de5d7337d7", "address": "fa:16:3e:ba:28:a5", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09d9f919-c2", "ovs_interfaceid": "09d9f919-c2ab-4c91-896e-88de5d7337d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'image_id': 'b8007ea6-afa7-4c5a-abc0-d9d7338ce087'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.279 186666 WARNING nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.280 186666 DEBUG nova.virt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-1003432610', uuid='f844ee8e-6a84-4d80-aa1d-b8f5230eccce'), owner=OwnerMeta(userid='f74ba8e1becb4d8f83bb148785aac310', username='tempest-TestExecuteHostMaintenanceStrategy-488758453-project-admin', projectid='87043e904d374d2fbc50a010c14c8987', projectname='tempest-TestExecuteHostMaintenanceStrategy-488758453'), image=ImageMeta(id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "09d9f919-c2ab-4c91-896e-88de5d7337d7", "address": "fa:16:3e:ba:28:a5", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09d9f919-c2", "ovs_interfaceid": 
"09d9f919-c2ab-4c91-896e-88de5d7337d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1771529611.2808719) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.286 186666 DEBUG nova.virt.libvirt.host [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.286 186666 DEBUG nova.virt.libvirt.host [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.289 186666 DEBUG nova.virt.libvirt.host [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.290 186666 DEBUG nova.virt.libvirt.host [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.291 186666 DEBUG nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.291 186666 DEBUG nova.virt.hardware [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-19T19:19:33Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.291 186666 DEBUG nova.virt.hardware [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.291 186666 DEBUG nova.virt.hardware [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.292 186666 DEBUG nova.virt.hardware [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.292 186666 DEBUG nova.virt.hardware [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.292 186666 DEBUG nova.virt.hardware [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.292 186666 DEBUG nova.virt.hardware [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.292 186666 DEBUG nova.virt.hardware [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.293 186666 DEBUG nova.virt.hardware [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.293 186666 DEBUG nova.virt.hardware [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.293 186666 DEBUG nova.virt.hardware [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.296 186666 DEBUG nova.virt.libvirt.vif [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:33:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1003432610',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1003432610',id=13,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='87043e904d374d2fbc50a010c14c8987',ramdisk_id='',reservation_id='r-lqs1x8kd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-488758453',owner_user_name='tempest-Te
stExecuteHostMaintenanceStrategy-488758453-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:33:25Z,user_data=None,user_id='f74ba8e1becb4d8f83bb148785aac310',uuid=f844ee8e-6a84-4d80-aa1d-b8f5230eccce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09d9f919-c2ab-4c91-896e-88de5d7337d7", "address": "fa:16:3e:ba:28:a5", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09d9f919-c2", "ovs_interfaceid": "09d9f919-c2ab-4c91-896e-88de5d7337d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.296 186666 DEBUG nova.network.os_vif_util [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Converting VIF {"id": "09d9f919-c2ab-4c91-896e-88de5d7337d7", "address": "fa:16:3e:ba:28:a5", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09d9f919-c2", "ovs_interfaceid": "09d9f919-c2ab-4c91-896e-88de5d7337d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.297 186666 DEBUG nova.network.os_vif_util [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:28:a5,bridge_name='br-int',has_traffic_filtering=True,id=09d9f919-c2ab-4c91-896e-88de5d7337d7,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09d9f919-c2') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.298 186666 DEBUG nova.objects.instance [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lazy-loading 'pci_devices' on Instance uuid f844ee8e-6a84-4d80-aa1d-b8f5230eccce obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:33:31 compute-0 openstack_network_exporter[198916]: ERROR   19:33:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:33:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:33:31 compute-0 openstack_network_exporter[198916]: ERROR   19:33:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:33:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.807 186666 DEBUG nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] End _get_guest_xml xml=<domain type="kvm">
Feb 19 19:33:31 compute-0 nova_compute[186662]:   <uuid>f844ee8e-6a84-4d80-aa1d-b8f5230eccce</uuid>
Feb 19 19:33:31 compute-0 nova_compute[186662]:   <name>instance-0000000d</name>
Feb 19 19:33:31 compute-0 nova_compute[186662]:   <memory>131072</memory>
Feb 19 19:33:31 compute-0 nova_compute[186662]:   <vcpu>1</vcpu>
Feb 19 19:33:31 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1003432610</nova:name>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:33:31</nova:creationTime>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:33:31 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:33:31 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:33:31 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:33:31 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:33:31 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:33:31 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:33:31 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:33:31 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:33:31 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:33:31 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:33:31 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:33:31 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:33:31 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:33:31 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:33:31 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:33:31 compute-0 nova_compute[186662]:         <nova:user uuid="f74ba8e1becb4d8f83bb148785aac310">tempest-TestExecuteHostMaintenanceStrategy-488758453-project-admin</nova:user>
Feb 19 19:33:31 compute-0 nova_compute[186662]:         <nova:project uuid="87043e904d374d2fbc50a010c14c8987">tempest-TestExecuteHostMaintenanceStrategy-488758453</nova:project>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:33:31 compute-0 nova_compute[186662]:         <nova:port uuid="09d9f919-c2ab-4c91-896e-88de5d7337d7">
Feb 19 19:33:31 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:33:31 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:33:31 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <system>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <entry name="serial">f844ee8e-6a84-4d80-aa1d-b8f5230eccce</entry>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <entry name="uuid">f844ee8e-6a84-4d80-aa1d-b8f5230eccce</entry>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     </system>
Feb 19 19:33:31 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:33:31 compute-0 nova_compute[186662]:   <os>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:   </os>
Feb 19 19:33:31 compute-0 nova_compute[186662]:   <features>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <vmcoreinfo/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:   </features>
Feb 19 19:33:31 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:33:31 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact">
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <model>Nehalem</model>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <topology sockets="1" cores="1" threads="1"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:33:31 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce/disk"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce/disk.config"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <interface type="ethernet">
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <mac address="fa:16:3e:ba:28:a5"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <mtu size="1442"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <target dev="tap09d9f919-c2"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <serial type="pty">
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce/console.log" append="off"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <video>
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     </video>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <controller type="usb" index="0"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:33:31 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:33:31 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:33:31 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:33:31 compute-0 nova_compute[186662]: </domain>
Feb 19 19:33:31 compute-0 nova_compute[186662]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.809 186666 DEBUG nova.compute.manager [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Preparing to wait for external event network-vif-plugged-09d9f919-c2ab-4c91-896e-88de5d7337d7 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.809 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.810 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.810 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.811 186666 DEBUG nova.virt.libvirt.vif [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:33:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1003432610',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1003432610',id=13,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='87043e904d374d2fbc50a010c14c8987',ramdisk_id='',reservation_id='r-lqs1x8kd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-488758453',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-488758453-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:33:25Z,user_data=None,user_id='f74ba8e1becb4d8f83bb148785aac310',uuid=f844ee8e-6a84-4d80-aa1d-b8f5230eccce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09d9f919-c2ab-4c91-896e-88de5d7337d7", "address": "fa:16:3e:ba:28:a5", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09d9f919-c2", "ovs_interfaceid": "09d9f919-c2ab-4c91-896e-88de5d7337d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.812 186666 DEBUG nova.network.os_vif_util [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Converting VIF {"id": "09d9f919-c2ab-4c91-896e-88de5d7337d7", "address": "fa:16:3e:ba:28:a5", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09d9f919-c2", "ovs_interfaceid": "09d9f919-c2ab-4c91-896e-88de5d7337d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.813 186666 DEBUG nova.network.os_vif_util [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:28:a5,bridge_name='br-int',has_traffic_filtering=True,id=09d9f919-c2ab-4c91-896e-88de5d7337d7,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09d9f919-c2') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.813 186666 DEBUG os_vif [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:28:a5,bridge_name='br-int',has_traffic_filtering=True,id=09d9f919-c2ab-4c91-896e-88de5d7337d7,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09d9f919-c2') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.814 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.815 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.815 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.816 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.817 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'e64e5256-7915-5159-aeee-29b5b0501cdc', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.820 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.823 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.823 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09d9f919-c2, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.824 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap09d9f919-c2, col_values=(('qos', UUID('fd941937-5032-432b-af32-dd7694ab1a4a')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.824 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap09d9f919-c2, col_values=(('external_ids', {'iface-id': '09d9f919-c2ab-4c91-896e-88de5d7337d7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ba:28:a5', 'vm-uuid': 'f844ee8e-6a84-4d80-aa1d-b8f5230eccce'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.825 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:31 compute-0 NetworkManager[56519]: <info>  [1771529611.8271] manager: (tap09d9f919-c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.828 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.831 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:31 compute-0 nova_compute[186662]: 2026-02-19 19:33:31.832 186666 INFO os_vif [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:28:a5,bridge_name='br-int',has_traffic_filtering=True,id=09d9f919-c2ab-4c91-896e-88de5d7337d7,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09d9f919-c2')
Feb 19 19:33:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:32.130 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:33:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:32.131 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:33:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:32.131 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:33:32 compute-0 nova_compute[186662]: 2026-02-19 19:33:32.950 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:33 compute-0 nova_compute[186662]: 2026-02-19 19:33:33.374 186666 DEBUG nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:33:33 compute-0 nova_compute[186662]: 2026-02-19 19:33:33.374 186666 DEBUG nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:33:33 compute-0 nova_compute[186662]: 2026-02-19 19:33:33.375 186666 DEBUG nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] No VIF found with MAC fa:16:3e:ba:28:a5, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Feb 19 19:33:33 compute-0 nova_compute[186662]: 2026-02-19 19:33:33.376 186666 INFO nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Using config drive
Feb 19 19:33:33 compute-0 nova_compute[186662]: 2026-02-19 19:33:33.887 186666 WARNING neutronclient.v2_0.client [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:33:34 compute-0 nova_compute[186662]: 2026-02-19 19:33:34.795 186666 INFO nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Creating config drive at /var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce/disk.config
Feb 19 19:33:34 compute-0 nova_compute[186662]: 2026-02-19 19:33:34.803 186666 DEBUG oslo_concurrency.processutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp26l_v3s2 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:33:34 compute-0 nova_compute[186662]: 2026-02-19 19:33:34.922 186666 DEBUG oslo_concurrency.processutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp26l_v3s2" returned: 0 in 0.120s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:33:34 compute-0 kernel: tap09d9f919-c2: entered promiscuous mode
Feb 19 19:33:34 compute-0 NetworkManager[56519]: <info>  [1771529614.9761] manager: (tap09d9f919-c2): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.014 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:35 compute-0 systemd-udevd[211758]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:33:35 compute-0 ovn_controller[96653]: 2026-02-19T19:33:35Z|00105|binding|INFO|Claiming lport 09d9f919-c2ab-4c91-896e-88de5d7337d7 for this chassis.
Feb 19 19:33:35 compute-0 ovn_controller[96653]: 2026-02-19T19:33:35Z|00106|binding|INFO|09d9f919-c2ab-4c91-896e-88de5d7337d7: Claiming fa:16:3e:ba:28:a5 10.100.0.3
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.018 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:35 compute-0 NetworkManager[56519]: <info>  [1771529615.0277] device (tap09d9f919-c2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:33:35 compute-0 NetworkManager[56519]: <info>  [1771529615.0284] device (tap09d9f919-c2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:33:35 compute-0 systemd-machined[156014]: New machine qemu-9-instance-0000000d.
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.041 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:28:a5 10.100.0.3'], port_security=['fa:16:3e:ba:28:a5 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'f844ee8e-6a84-4d80-aa1d-b8f5230eccce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '87043e904d374d2fbc50a010c14c8987', 'neutron:revision_number': '4', 'neutron:security_group_ids': '624104e9-162a-4121-9113-49a4c95099e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82a17ccb-74c9-404e-969f-7001a470c860, chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=09d9f919-c2ab-4c91-896e-88de5d7337d7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.042 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 09d9f919-c2ab-4c91-896e-88de5d7337d7 in datapath f37b00da-2392-46ae-ac87-2c54ab8961a2 bound to our chassis
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.044 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f37b00da-2392-46ae-ac87-2c54ab8961a2
Feb 19 19:33:35 compute-0 ovn_controller[96653]: 2026-02-19T19:33:35Z|00107|binding|INFO|Setting lport 09d9f919-c2ab-4c91-896e-88de5d7337d7 ovn-installed in OVS
Feb 19 19:33:35 compute-0 ovn_controller[96653]: 2026-02-19T19:33:35Z|00108|binding|INFO|Setting lport 09d9f919-c2ab-4c91-896e-88de5d7337d7 up in Southbound
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.052 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.056 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[57828661-955e-4c82-9229-e3c52b3ff347]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.057 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf37b00da-21 in ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Feb 19 19:33:35 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-0000000d.
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.059 207540 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf37b00da-20 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.059 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[c84c0fb8-7f95-4d1f-b9a6-d109a76c51c0]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.060 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[b57e4e60-9e4a-4df1-a113-7582b358d879]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.073 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[22eaa408-5b08-4fff-a08f-9b2e77becc49]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.091 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[725e1b48-af0e-42e3-9ce9-5ca348104344]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.114 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[7ab10697-805d-45b4-bc1b-78d3e1738b7f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.118 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ff33d207-e382-4da0-b360-ce0b95ad2e5a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:33:35 compute-0 systemd-udevd[211762]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:33:35 compute-0 NetworkManager[56519]: <info>  [1771529615.1199] manager: (tapf37b00da-20): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.144 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[990fc12d-7d1f-4b98-87a0-eca614f2eb7e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.146 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[cf99d09f-24f1-47bb-9812-c60e3abeb59c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:33:35 compute-0 NetworkManager[56519]: <info>  [1771529615.1630] device (tapf37b00da-20): carrier: link connected
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.167 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[bda2f3c7-c888-4db8-957b-9d78c1a15831]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.178 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[25ee5af3-b5b1-4aa2-a0d5-af37aaa2ee62]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf37b00da-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:25:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388472, 'reachable_time': 34062, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211794, 'error': None, 'target': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.187 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[f4f3a455-a47e-47b3-baa7-24113c056365]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedc:2582'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388472, 'tstamp': 388472}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211795, 'error': None, 'target': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.198 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[c72d44f4-8526-48c8-a7e3-0c5d575fc787]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf37b00da-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:25:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388472, 'reachable_time': 34062, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211796, 'error': None, 'target': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.222 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[4c58a265-762c-46dc-91a5-778274fb7283]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.262 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[b1dd99fb-1aae-4827-801a-67f71ffbc58c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.263 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf37b00da-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.263 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.263 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf37b00da-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:33:35 compute-0 NetworkManager[56519]: <info>  [1771529615.2658] manager: (tapf37b00da-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.265 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:35 compute-0 kernel: tapf37b00da-20: entered promiscuous mode
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.267 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.268 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf37b00da-20, col_values=(('external_ids', {'iface-id': 'e682361b-89b8-4c67-9593-6b6d57e3096a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.269 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:35 compute-0 ovn_controller[96653]: 2026-02-19T19:33:35Z|00109|binding|INFO|Releasing lport e682361b-89b8-4c67-9593-6b6d57e3096a from this chassis (sb_readonly=0)
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.272 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.275 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.276 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[e6416029-5342-4aef-a32c-aedfc46d4dc6]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.277 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.277 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.277 105986 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for f37b00da-2392-46ae-ac87-2c54ab8961a2 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.277 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.278 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[9bf685c5-430d-4977-ac35-a7a2a42d37a2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.278 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.278 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[13c4f93c-30e1-44c1-92ff-f8d21e1f02d4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.279 105986 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: global
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     log         /dev/log local0 debug
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     log-tag     haproxy-metadata-proxy-f37b00da-2392-46ae-ac87-2c54ab8961a2
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     user        root
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     group       root
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     maxconn     1024
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     pidfile     /var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     daemon
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: defaults
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     log global
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     mode http
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     option httplog
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     option dontlognull
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     option http-server-close
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     option forwardfor
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     retries                 3
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     timeout http-request    30s
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     timeout connect         30s
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     timeout client          32s
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     timeout server          32s
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     timeout http-keep-alive 30s
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: listen listener
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     bind 169.254.169.254:80
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     server metadata /var/lib/neutron/metadata_proxy
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:     http-request add-header X-OVN-Network-ID f37b00da-2392-46ae-ac87-2c54ab8961a2
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Feb 19 19:33:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:33:35.279 105986 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'env', 'PROCESS_TAG=haproxy-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f37b00da-2392-46ae-ac87-2c54ab8961a2.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.292 186666 DEBUG nova.compute.manager [req-e2cefb5c-0c65-4bb6-9bcb-b410d54b6efa req-7db748d7-1b1b-4af0-8cf6-488a512c1adb 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Received event network-vif-plugged-09d9f919-c2ab-4c91-896e-88de5d7337d7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.292 186666 DEBUG oslo_concurrency.lockutils [req-e2cefb5c-0c65-4bb6-9bcb-b410d54b6efa req-7db748d7-1b1b-4af0-8cf6-488a512c1adb 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.293 186666 DEBUG oslo_concurrency.lockutils [req-e2cefb5c-0c65-4bb6-9bcb-b410d54b6efa req-7db748d7-1b1b-4af0-8cf6-488a512c1adb 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.293 186666 DEBUG oslo_concurrency.lockutils [req-e2cefb5c-0c65-4bb6-9bcb-b410d54b6efa req-7db748d7-1b1b-4af0-8cf6-488a512c1adb 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.293 186666 DEBUG nova.compute.manager [req-e2cefb5c-0c65-4bb6-9bcb-b410d54b6efa req-7db748d7-1b1b-4af0-8cf6-488a512c1adb 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Processing event network-vif-plugged-09d9f919-c2ab-4c91-896e-88de5d7337d7 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.431 186666 DEBUG nova.compute.manager [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.441 186666 DEBUG nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.444 186666 INFO nova.virt.libvirt.driver [-] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Instance spawned successfully.
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.444 186666 DEBUG nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Feb 19 19:33:35 compute-0 podman[211835]: 2026-02-19 19:33:35.613612217 +0000 UTC m=+0.042090324 container create 075df9ee4106fa07b920228f993bf4f48d8e5dd23e0526b5b02d2e4b563fe3be (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 19 19:33:35 compute-0 systemd[1]: Started libpod-conmon-075df9ee4106fa07b920228f993bf4f48d8e5dd23e0526b5b02d2e4b563fe3be.scope.
Feb 19 19:33:35 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:33:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12bed4a781ee87158df8114fc420396a9d9978c0b4881e96d2a76877fe1f11cf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 19 19:33:35 compute-0 podman[211835]: 2026-02-19 19:33:35.668323422 +0000 UTC m=+0.096801559 container init 075df9ee4106fa07b920228f993bf4f48d8e5dd23e0526b5b02d2e4b563fe3be (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Feb 19 19:33:35 compute-0 podman[211835]: 2026-02-19 19:33:35.67670432 +0000 UTC m=+0.105182427 container start 075df9ee4106fa07b920228f993bf4f48d8e5dd23e0526b5b02d2e4b563fe3be (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0)
Feb 19 19:33:35 compute-0 podman[211835]: 2026-02-19 19:33:35.588785871 +0000 UTC m=+0.017263998 image pull dfc4ee2c621ea595c16a7cd7a8233293e80df6b99346b1ce1a7ed22a35aace27 38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Feb 19 19:33:35 compute-0 neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2[211851]: [NOTICE]   (211855) : New worker (211857) forked
Feb 19 19:33:35 compute-0 neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2[211851]: [NOTICE]   (211855) : Loading success.
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.955 186666 DEBUG nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.955 186666 DEBUG nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.955 186666 DEBUG nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.956 186666 DEBUG nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.956 186666 DEBUG nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:33:35 compute-0 nova_compute[186662]: 2026-02-19 19:33:35.956 186666 DEBUG nova.virt.libvirt.driver [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:33:36 compute-0 nova_compute[186662]: 2026-02-19 19:33:36.464 186666 INFO nova.compute.manager [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Took 9.58 seconds to spawn the instance on the hypervisor.
Feb 19 19:33:36 compute-0 nova_compute[186662]: 2026-02-19 19:33:36.465 186666 DEBUG nova.compute.manager [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:33:36 compute-0 nova_compute[186662]: 2026-02-19 19:33:36.826 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:37 compute-0 nova_compute[186662]: 2026-02-19 19:33:37.000 186666 INFO nova.compute.manager [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Took 14.79 seconds to build instance.
Feb 19 19:33:37 compute-0 podman[211866]: 2026-02-19 19:33:37.273615128 +0000 UTC m=+0.051404274 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216)
Feb 19 19:33:37 compute-0 nova_compute[186662]: 2026-02-19 19:33:37.359 186666 DEBUG nova.compute.manager [req-17a03d59-7732-4bb4-a372-de8dcb11be35 req-cc16bd43-fa8c-4b53-b8c3-2139b0ad12c4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Received event network-vif-plugged-09d9f919-c2ab-4c91-896e-88de5d7337d7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:33:37 compute-0 nova_compute[186662]: 2026-02-19 19:33:37.359 186666 DEBUG oslo_concurrency.lockutils [req-17a03d59-7732-4bb4-a372-de8dcb11be35 req-cc16bd43-fa8c-4b53-b8c3-2139b0ad12c4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:33:37 compute-0 nova_compute[186662]: 2026-02-19 19:33:37.360 186666 DEBUG oslo_concurrency.lockutils [req-17a03d59-7732-4bb4-a372-de8dcb11be35 req-cc16bd43-fa8c-4b53-b8c3-2139b0ad12c4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:33:37 compute-0 nova_compute[186662]: 2026-02-19 19:33:37.360 186666 DEBUG oslo_concurrency.lockutils [req-17a03d59-7732-4bb4-a372-de8dcb11be35 req-cc16bd43-fa8c-4b53-b8c3-2139b0ad12c4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:33:37 compute-0 nova_compute[186662]: 2026-02-19 19:33:37.360 186666 DEBUG nova.compute.manager [req-17a03d59-7732-4bb4-a372-de8dcb11be35 req-cc16bd43-fa8c-4b53-b8c3-2139b0ad12c4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] No waiting events found dispatching network-vif-plugged-09d9f919-c2ab-4c91-896e-88de5d7337d7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:33:37 compute-0 nova_compute[186662]: 2026-02-19 19:33:37.360 186666 WARNING nova.compute.manager [req-17a03d59-7732-4bb4-a372-de8dcb11be35 req-cc16bd43-fa8c-4b53-b8c3-2139b0ad12c4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Received unexpected event network-vif-plugged-09d9f919-c2ab-4c91-896e-88de5d7337d7 for instance with vm_state active and task_state None.
Feb 19 19:33:37 compute-0 nova_compute[186662]: 2026-02-19 19:33:37.505 186666 DEBUG oslo_concurrency.lockutils [None req-c6c0ba7a-72fc-43c4-b6f7-f0ed88140327 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.312s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:33:37 compute-0 nova_compute[186662]: 2026-02-19 19:33:37.995 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:41 compute-0 podman[211887]: 2026-02-19 19:33:41.305481399 +0000 UTC m=+0.079577413 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, release=1770267347, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., 
vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, version=9.7)
Feb 19 19:33:41 compute-0 nova_compute[186662]: 2026-02-19 19:33:41.829 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:43 compute-0 nova_compute[186662]: 2026-02-19 19:33:43.048 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:43 compute-0 podman[211909]: 2026-02-19 19:33:43.330595347 +0000 UTC m=+0.091368655 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, 
container_name=ovn_controller, io.buildah.version=1.43.0)
Feb 19 19:33:46 compute-0 nova_compute[186662]: 2026-02-19 19:33:46.832 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:47 compute-0 ovn_controller[96653]: 2026-02-19T19:33:47Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ba:28:a5 10.100.0.3
Feb 19 19:33:47 compute-0 ovn_controller[96653]: 2026-02-19T19:33:47Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ba:28:a5 10.100.0.3
Feb 19 19:33:48 compute-0 nova_compute[186662]: 2026-02-19 19:33:48.051 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:49 compute-0 nova_compute[186662]: 2026-02-19 19:33:49.680 186666 DEBUG nova.virt.libvirt.driver [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Creating tmpfile /var/lib/nova/instances/tmp32_rrgtt to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Feb 19 19:33:49 compute-0 nova_compute[186662]: 2026-02-19 19:33:49.681 186666 WARNING neutronclient.v2_0.client [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:33:49 compute-0 nova_compute[186662]: 2026-02-19 19:33:49.692 186666 DEBUG nova.compute.manager [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp32_rrgtt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Feb 19 19:33:49 compute-0 podman[211959]: 2026-02-19 19:33:49.753538845 +0000 UTC m=+0.043784016 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:33:51 compute-0 sshd-session[211983]: Invalid user ubuntu from 197.211.55.20 port 54598
Feb 19 19:33:51 compute-0 nova_compute[186662]: 2026-02-19 19:33:51.770 186666 WARNING neutronclient.v2_0.client [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:33:51 compute-0 nova_compute[186662]: 2026-02-19 19:33:51.834 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:51 compute-0 sshd-session[211983]: Received disconnect from 197.211.55.20 port 54598:11: Bye Bye [preauth]
Feb 19 19:33:51 compute-0 sshd-session[211983]: Disconnected from invalid user ubuntu 197.211.55.20 port 54598 [preauth]
Feb 19 19:33:53 compute-0 nova_compute[186662]: 2026-02-19 19:33:53.054 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:55 compute-0 nova_compute[186662]: 2026-02-19 19:33:55.666 186666 DEBUG nova.compute.manager [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp32_rrgtt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='975e14fd-3f0a-458d-81e0-ff0cd2344d14',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Feb 19 19:33:56 compute-0 nova_compute[186662]: 2026-02-19 19:33:56.682 186666 DEBUG oslo_concurrency.lockutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-975e14fd-3f0a-458d-81e0-ff0cd2344d14" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:33:56 compute-0 nova_compute[186662]: 2026-02-19 19:33:56.683 186666 DEBUG oslo_concurrency.lockutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-975e14fd-3f0a-458d-81e0-ff0cd2344d14" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:33:56 compute-0 nova_compute[186662]: 2026-02-19 19:33:56.683 186666 DEBUG nova.network.neutron [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:33:56 compute-0 nova_compute[186662]: 2026-02-19 19:33:56.836 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:57 compute-0 nova_compute[186662]: 2026-02-19 19:33:57.190 186666 WARNING neutronclient.v2_0.client [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:33:58 compute-0 nova_compute[186662]: 2026-02-19 19:33:58.038 186666 WARNING neutronclient.v2_0.client [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:33:58 compute-0 nova_compute[186662]: 2026-02-19 19:33:58.054 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:33:58 compute-0 nova_compute[186662]: 2026-02-19 19:33:58.240 186666 DEBUG nova.network.neutron [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Updating instance_info_cache with network_info: [{"id": "5248d484-2946-4983-9716-18e4eed7c94c", "address": "fa:16:3e:1f:b0:ef", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5248d484-29", "ovs_interfaceid": "5248d484-2946-4983-9716-18e4eed7c94c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:33:58 compute-0 nova_compute[186662]: 2026-02-19 19:33:58.753 186666 DEBUG oslo_concurrency.lockutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-975e14fd-3f0a-458d-81e0-ff0cd2344d14" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:33:58 compute-0 nova_compute[186662]: 2026-02-19 19:33:58.769 186666 DEBUG nova.virt.libvirt.driver [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp32_rrgtt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='975e14fd-3f0a-458d-81e0-ff0cd2344d14',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Feb 19 19:33:58 compute-0 nova_compute[186662]: 2026-02-19 19:33:58.770 186666 DEBUG nova.virt.libvirt.driver [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Creating instance directory: /var/lib/nova/instances/975e14fd-3f0a-458d-81e0-ff0cd2344d14 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Feb 19 19:33:58 compute-0 nova_compute[186662]: 2026-02-19 19:33:58.771 186666 DEBUG nova.virt.libvirt.driver [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Creating disk.info with the contents: {'/var/lib/nova/instances/975e14fd-3f0a-458d-81e0-ff0cd2344d14/disk': 'qcow2', '/var/lib/nova/instances/975e14fd-3f0a-458d-81e0-ff0cd2344d14/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Feb 19 19:33:58 compute-0 nova_compute[186662]: 2026-02-19 19:33:58.771 186666 DEBUG nova.virt.libvirt.driver [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Feb 19 19:33:58 compute-0 nova_compute[186662]: 2026-02-19 19:33:58.772 186666 DEBUG nova.objects.instance [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 975e14fd-3f0a-458d-81e0-ff0cd2344d14 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:33:59 compute-0 nova_compute[186662]: 2026-02-19 19:33:59.279 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:33:59 compute-0 nova_compute[186662]: 2026-02-19 19:33:59.286 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:33:59 compute-0 nova_compute[186662]: 2026-02-19 19:33:59.288 186666 DEBUG oslo_concurrency.processutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:33:59 compute-0 nova_compute[186662]: 2026-02-19 19:33:59.373 186666 DEBUG oslo_concurrency.processutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:33:59 compute-0 nova_compute[186662]: 2026-02-19 19:33:59.374 186666 DEBUG oslo_concurrency.lockutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:33:59 compute-0 nova_compute[186662]: 2026-02-19 19:33:59.375 186666 DEBUG oslo_concurrency.lockutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:33:59 compute-0 nova_compute[186662]: 2026-02-19 19:33:59.376 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:33:59 compute-0 nova_compute[186662]: 2026-02-19 19:33:59.380 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:33:59 compute-0 nova_compute[186662]: 2026-02-19 19:33:59.380 186666 DEBUG oslo_concurrency.processutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:33:59 compute-0 nova_compute[186662]: 2026-02-19 19:33:59.432 186666 DEBUG oslo_concurrency.processutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:33:59 compute-0 nova_compute[186662]: 2026-02-19 19:33:59.434 186666 DEBUG oslo_concurrency.processutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/975e14fd-3f0a-458d-81e0-ff0cd2344d14/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:33:59 compute-0 nova_compute[186662]: 2026-02-19 19:33:59.459 186666 DEBUG oslo_concurrency.processutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/975e14fd-3f0a-458d-81e0-ff0cd2344d14/disk 1073741824" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:33:59 compute-0 nova_compute[186662]: 2026-02-19 19:33:59.460 186666 DEBUG oslo_concurrency.lockutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.085s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:33:59 compute-0 nova_compute[186662]: 2026-02-19 19:33:59.461 186666 DEBUG oslo_concurrency.processutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:33:59 compute-0 nova_compute[186662]: 2026-02-19 19:33:59.502 186666 DEBUG oslo_concurrency.processutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:33:59 compute-0 nova_compute[186662]: 2026-02-19 19:33:59.503 186666 DEBUG nova.virt.disk.api [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Checking if we can resize image /var/lib/nova/instances/975e14fd-3f0a-458d-81e0-ff0cd2344d14/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:33:59 compute-0 nova_compute[186662]: 2026-02-19 19:33:59.504 186666 DEBUG oslo_concurrency.processutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/975e14fd-3f0a-458d-81e0-ff0cd2344d14/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:33:59 compute-0 nova_compute[186662]: 2026-02-19 19:33:59.547 186666 DEBUG oslo_concurrency.processutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/975e14fd-3f0a-458d-81e0-ff0cd2344d14/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:33:59 compute-0 nova_compute[186662]: 2026-02-19 19:33:59.547 186666 DEBUG nova.virt.disk.api [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Cannot resize image /var/lib/nova/instances/975e14fd-3f0a-458d-81e0-ff0cd2344d14/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:33:59 compute-0 nova_compute[186662]: 2026-02-19 19:33:59.548 186666 DEBUG nova.objects.instance [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'migration_context' on Instance uuid 975e14fd-3f0a-458d-81e0-ff0cd2344d14 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:33:59 compute-0 podman[196025]: time="2026-02-19T19:33:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:33:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:33:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17221 "" "Go-http-client/1.1"
Feb 19 19:33:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:33:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2654 "" "Go-http-client/1.1"
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.058 186666 DEBUG nova.objects.base [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Object Instance<975e14fd-3f0a-458d-81e0-ff0cd2344d14> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.058 186666 DEBUG oslo_concurrency.processutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/975e14fd-3f0a-458d-81e0-ff0cd2344d14/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.074 186666 DEBUG oslo_concurrency.processutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/975e14fd-3f0a-458d-81e0-ff0cd2344d14/disk.config 497664" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.075 186666 DEBUG nova.virt.libvirt.driver [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.076 186666 DEBUG nova.virt.libvirt.vif [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-02-19T19:33:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1847271986',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1847271986',id=12,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:33:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='87043e904d374d2fbc50a010c14c8987',ramdisk_id='',reservation_id='r-fuu10k82',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-488758453',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-488758453-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:33:15Z,user_data=None,user_id='f74ba8e1becb4d8f83bb148785aac310',uuid=975e14fd-3f0a-458d-81e0-ff0cd2344d14,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5248d484-2946-4983-9716-18e4eed7c94c", "address": "fa:16:3e:1f:b0:ef", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap5248d484-29", "ovs_interfaceid": "5248d484-2946-4983-9716-18e4eed7c94c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.077 186666 DEBUG nova.network.os_vif_util [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "5248d484-2946-4983-9716-18e4eed7c94c", "address": "fa:16:3e:1f:b0:ef", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap5248d484-29", "ovs_interfaceid": "5248d484-2946-4983-9716-18e4eed7c94c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.077 186666 DEBUG nova.network.os_vif_util [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:b0:ef,bridge_name='br-int',has_traffic_filtering=True,id=5248d484-2946-4983-9716-18e4eed7c94c,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5248d484-29') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.078 186666 DEBUG os_vif [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:b0:ef,bridge_name='br-int',has_traffic_filtering=True,id=5248d484-2946-4983-9716-18e4eed7c94c,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5248d484-29') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.079 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.079 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.079 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.080 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.081 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '2b2db050-5e5d-5005-bdc7-1ceaa2090059', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.082 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.084 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.088 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.088 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5248d484-29, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.089 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap5248d484-29, col_values=(('qos', UUID('637dcbc4-bb68-47d8-b8fa-59bcffc3bb80')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.090 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap5248d484-29, col_values=(('external_ids', {'iface-id': '5248d484-2946-4983-9716-18e4eed7c94c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:b0:ef', 'vm-uuid': '975e14fd-3f0a-458d-81e0-ff0cd2344d14'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.091 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:00 compute-0 NetworkManager[56519]: <info>  [1771529640.0921] manager: (tap5248d484-29): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.094 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.096 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.097 186666 INFO os_vif [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:b0:ef,bridge_name='br-int',has_traffic_filtering=True,id=5248d484-2946-4983-9716-18e4eed7c94c,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5248d484-29')
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.098 186666 DEBUG nova.virt.libvirt.driver [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.098 186666 DEBUG nova.compute.manager [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp32_rrgtt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='975e14fd-3f0a-458d-81e0-ff0cd2344d14',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.099 186666 WARNING neutronclient.v2_0.client [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.361 186666 WARNING neutronclient.v2_0.client [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:34:00 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:00.618 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:34:00 compute-0 nova_compute[186662]: 2026-02-19 19:34:00.618 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:00 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:00.619 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:34:00 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:00.622 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:34:01 compute-0 nova_compute[186662]: 2026-02-19 19:34:01.019 186666 DEBUG nova.network.neutron [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Port 5248d484-2946-4983-9716-18e4eed7c94c updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Feb 19 19:34:01 compute-0 nova_compute[186662]: 2026-02-19 19:34:01.033 186666 DEBUG nova.compute.manager [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp32_rrgtt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='975e14fd-3f0a-458d-81e0-ff0cd2344d14',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Feb 19 19:34:01 compute-0 openstack_network_exporter[198916]: ERROR   19:34:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:34:01 compute-0 openstack_network_exporter[198916]: ERROR   19:34:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:34:03 compute-0 nova_compute[186662]: 2026-02-19 19:34:03.055 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:04 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 19 19:34:04 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 19 19:34:04 compute-0 kernel: tap5248d484-29: entered promiscuous mode
Feb 19 19:34:04 compute-0 NetworkManager[56519]: <info>  [1771529644.8874] manager: (tap5248d484-29): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Feb 19 19:34:04 compute-0 ovn_controller[96653]: 2026-02-19T19:34:04Z|00110|binding|INFO|Claiming lport 5248d484-2946-4983-9716-18e4eed7c94c for this additional chassis.
Feb 19 19:34:04 compute-0 nova_compute[186662]: 2026-02-19 19:34:04.889 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:04 compute-0 ovn_controller[96653]: 2026-02-19T19:34:04Z|00111|binding|INFO|5248d484-2946-4983-9716-18e4eed7c94c: Claiming fa:16:3e:1f:b0:ef 10.100.0.9
Feb 19 19:34:04 compute-0 ovn_controller[96653]: 2026-02-19T19:34:04Z|00112|binding|INFO|Setting lport 5248d484-2946-4983-9716-18e4eed7c94c ovn-installed in OVS
Feb 19 19:34:04 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:04.897 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:b0:ef 10.100.0.9'], port_security=['fa:16:3e:1f:b0:ef 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '975e14fd-3f0a-458d-81e0-ff0cd2344d14', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '87043e904d374d2fbc50a010c14c8987', 'neutron:revision_number': '10', 'neutron:security_group_ids': '624104e9-162a-4121-9113-49a4c95099e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82a17ccb-74c9-404e-969f-7001a470c860, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=5248d484-2946-4983-9716-18e4eed7c94c) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:34:04 compute-0 nova_compute[186662]: 2026-02-19 19:34:04.898 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:04 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:04.898 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 5248d484-2946-4983-9716-18e4eed7c94c in datapath f37b00da-2392-46ae-ac87-2c54ab8961a2 unbound from our chassis
Feb 19 19:34:04 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:04.900 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f37b00da-2392-46ae-ac87-2c54ab8961a2
Feb 19 19:34:04 compute-0 nova_compute[186662]: 2026-02-19 19:34:04.900 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:04 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:04.913 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[55b36216-32a7-45eb-ab17-fbe91e92a8ba]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:04 compute-0 systemd-udevd[212042]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:34:04 compute-0 systemd-machined[156014]: New machine qemu-10-instance-0000000c.
Feb 19 19:34:04 compute-0 NetworkManager[56519]: <info>  [1771529644.9259] device (tap5248d484-29): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:34:04 compute-0 NetworkManager[56519]: <info>  [1771529644.9265] device (tap5248d484-29): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:34:04 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000c.
Feb 19 19:34:04 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:04.942 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[93b670a0-9344-4d5b-bbd1-f16a58edbfdb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:04 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:04.944 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[c73aa254-09bb-443a-82ab-c877b4a51071]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:04 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:04.967 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[b1891594-e210-458f-9904-35011109e03c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:04 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:04.977 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[c62cd80c-35ed-4168-a226-ce6df8468d5c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf37b00da-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:25:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388472, 'reachable_time': 34062, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212055, 'error': None, 'target': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:04 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:04.991 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[8dd15f52-cd5d-4eb8-9d9b-c31c69484eac]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf37b00da-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388480, 'tstamp': 388480}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212056, 'error': None, 'target': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf37b00da-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388482, 'tstamp': 388482}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212056, 'error': None, 'target': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:04 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:04.992 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf37b00da-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:34:05 compute-0 nova_compute[186662]: 2026-02-19 19:34:05.033 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:05 compute-0 nova_compute[186662]: 2026-02-19 19:34:05.034 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:05.034 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf37b00da-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:34:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:05.035 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:34:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:05.035 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf37b00da-20, col_values=(('external_ids', {'iface-id': 'e682361b-89b8-4c67-9593-6b6d57e3096a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:34:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:05.035 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:34:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:05.036 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[c07e509e-4128-4223-8dd8-5a6d4612a1b1]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f37b00da-2392-46ae-ac87-2c54ab8961a2\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f37b00da-2392-46ae-ac87-2c54ab8961a2\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:05 compute-0 nova_compute[186662]: 2026-02-19 19:34:05.092 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:05 compute-0 sshd-session[212006]: Received disconnect from 103.67.78.251 port 33908:11: Bye Bye [preauth]
Feb 19 19:34:05 compute-0 sshd-session[212006]: Disconnected from authenticating user root 103.67.78.251 port 33908 [preauth]
Feb 19 19:34:08 compute-0 nova_compute[186662]: 2026-02-19 19:34:08.091 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:08 compute-0 podman[212076]: 2026-02-19 19:34:08.28923805 +0000 UTC m=+0.061615498 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:34:08 compute-0 ovn_controller[96653]: 2026-02-19T19:34:08Z|00113|binding|INFO|Claiming lport 5248d484-2946-4983-9716-18e4eed7c94c for this chassis.
Feb 19 19:34:08 compute-0 ovn_controller[96653]: 2026-02-19T19:34:08Z|00114|binding|INFO|5248d484-2946-4983-9716-18e4eed7c94c: Claiming fa:16:3e:1f:b0:ef 10.100.0.9
Feb 19 19:34:08 compute-0 ovn_controller[96653]: 2026-02-19T19:34:08Z|00115|binding|INFO|Setting lport 5248d484-2946-4983-9716-18e4eed7c94c up in Southbound
Feb 19 19:34:10 compute-0 nova_compute[186662]: 2026-02-19 19:34:10.079 186666 INFO nova.compute.manager [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Post operation of migration started
Feb 19 19:34:10 compute-0 nova_compute[186662]: 2026-02-19 19:34:10.080 186666 WARNING neutronclient.v2_0.client [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:34:10 compute-0 nova_compute[186662]: 2026-02-19 19:34:10.093 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:10 compute-0 nova_compute[186662]: 2026-02-19 19:34:10.385 186666 WARNING neutronclient.v2_0.client [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:34:10 compute-0 nova_compute[186662]: 2026-02-19 19:34:10.386 186666 WARNING neutronclient.v2_0.client [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:34:10 compute-0 nova_compute[186662]: 2026-02-19 19:34:10.609 186666 DEBUG oslo_concurrency.lockutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-975e14fd-3f0a-458d-81e0-ff0cd2344d14" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:34:10 compute-0 nova_compute[186662]: 2026-02-19 19:34:10.610 186666 DEBUG oslo_concurrency.lockutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-975e14fd-3f0a-458d-81e0-ff0cd2344d14" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:34:10 compute-0 nova_compute[186662]: 2026-02-19 19:34:10.611 186666 DEBUG nova.network.neutron [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:34:11 compute-0 nova_compute[186662]: 2026-02-19 19:34:11.134 186666 WARNING neutronclient.v2_0.client [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:34:11 compute-0 nova_compute[186662]: 2026-02-19 19:34:11.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:34:11 compute-0 nova_compute[186662]: 2026-02-19 19:34:11.990 186666 WARNING neutronclient.v2_0.client [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:34:12 compute-0 nova_compute[186662]: 2026-02-19 19:34:12.129 186666 DEBUG nova.network.neutron [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Updating instance_info_cache with network_info: [{"id": "5248d484-2946-4983-9716-18e4eed7c94c", "address": "fa:16:3e:1f:b0:ef", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5248d484-29", "ovs_interfaceid": "5248d484-2946-4983-9716-18e4eed7c94c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:34:12 compute-0 podman[212097]: 2026-02-19 19:34:12.289095358 +0000 UTC m=+0.063002983 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible)
Feb 19 19:34:12 compute-0 nova_compute[186662]: 2026-02-19 19:34:12.638 186666 DEBUG oslo_concurrency.lockutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-975e14fd-3f0a-458d-81e0-ff0cd2344d14" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:34:13 compute-0 nova_compute[186662]: 2026-02-19 19:34:13.136 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:13 compute-0 nova_compute[186662]: 2026-02-19 19:34:13.158 186666 DEBUG oslo_concurrency.lockutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:34:13 compute-0 nova_compute[186662]: 2026-02-19 19:34:13.159 186666 DEBUG oslo_concurrency.lockutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:34:13 compute-0 nova_compute[186662]: 2026-02-19 19:34:13.159 186666 DEBUG oslo_concurrency.lockutils [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:34:13 compute-0 nova_compute[186662]: 2026-02-19 19:34:13.164 186666 INFO nova.virt.libvirt.driver [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 19 19:34:13 compute-0 virtqemud[186157]: Domain id=10 name='instance-0000000c' uuid=975e14fd-3f0a-458d-81e0-ff0cd2344d14 is tainted: custom-monitor
Feb 19 19:34:13 compute-0 nova_compute[186662]: 2026-02-19 19:34:13.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:34:13 compute-0 nova_compute[186662]: 2026-02-19 19:34:13.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:34:14 compute-0 nova_compute[186662]: 2026-02-19 19:34:14.170 186666 INFO nova.virt.libvirt.driver [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 19 19:34:14 compute-0 podman[212119]: 2026-02-19 19:34:14.348344961 +0000 UTC m=+0.116846977 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2)
Feb 19 19:34:14 compute-0 nova_compute[186662]: 2026-02-19 19:34:14.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:34:15 compute-0 nova_compute[186662]: 2026-02-19 19:34:15.095 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:15 compute-0 nova_compute[186662]: 2026-02-19 19:34:15.176 186666 INFO nova.virt.libvirt.driver [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 19 19:34:15 compute-0 nova_compute[186662]: 2026-02-19 19:34:15.181 186666 DEBUG nova.compute.manager [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:34:15 compute-0 nova_compute[186662]: 2026-02-19 19:34:15.690 186666 DEBUG nova.objects.instance [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Feb 19 19:34:16 compute-0 nova_compute[186662]: 2026-02-19 19:34:16.709 186666 WARNING neutronclient.v2_0.client [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:34:17 compute-0 nova_compute[186662]: 2026-02-19 19:34:17.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:34:17 compute-0 nova_compute[186662]: 2026-02-19 19:34:17.691 186666 WARNING neutronclient.v2_0.client [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:34:17 compute-0 nova_compute[186662]: 2026-02-19 19:34:17.692 186666 WARNING neutronclient.v2_0.client [None req-fd4d043e-b904-4193-9e67-ac839500522d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:34:18 compute-0 nova_compute[186662]: 2026-02-19 19:34:18.138 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:18 compute-0 nova_compute[186662]: 2026-02-19 19:34:18.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:34:19 compute-0 nova_compute[186662]: 2026-02-19 19:34:19.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:34:19 compute-0 nova_compute[186662]: 2026-02-19 19:34:19.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:34:19 compute-0 nova_compute[186662]: 2026-02-19 19:34:19.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:34:20 compute-0 nova_compute[186662]: 2026-02-19 19:34:20.085 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:34:20 compute-0 nova_compute[186662]: 2026-02-19 19:34:20.086 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:34:20 compute-0 nova_compute[186662]: 2026-02-19 19:34:20.086 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:34:20 compute-0 nova_compute[186662]: 2026-02-19 19:34:20.087 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:34:20 compute-0 nova_compute[186662]: 2026-02-19 19:34:20.097 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:20 compute-0 podman[212146]: 2026-02-19 19:34:20.191338838 +0000 UTC m=+0.060095520 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:34:21 compute-0 nova_compute[186662]: 2026-02-19 19:34:21.132 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/975e14fd-3f0a-458d-81e0-ff0cd2344d14/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:34:21 compute-0 nova_compute[186662]: 2026-02-19 19:34:21.200 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/975e14fd-3f0a-458d-81e0-ff0cd2344d14/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:34:21 compute-0 nova_compute[186662]: 2026-02-19 19:34:21.201 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/975e14fd-3f0a-458d-81e0-ff0cd2344d14/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:34:21 compute-0 nova_compute[186662]: 2026-02-19 19:34:21.270 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/975e14fd-3f0a-458d-81e0-ff0cd2344d14/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:34:21 compute-0 nova_compute[186662]: 2026-02-19 19:34:21.275 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:34:21 compute-0 nova_compute[186662]: 2026-02-19 19:34:21.314 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce/disk --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:34:21 compute-0 nova_compute[186662]: 2026-02-19 19:34:21.315 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:34:21 compute-0 nova_compute[186662]: 2026-02-19 19:34:21.370 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:34:21 compute-0 nova_compute[186662]: 2026-02-19 19:34:21.499 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:34:21 compute-0 nova_compute[186662]: 2026-02-19 19:34:21.501 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:34:21 compute-0 nova_compute[186662]: 2026-02-19 19:34:21.512 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:34:21 compute-0 nova_compute[186662]: 2026-02-19 19:34:21.513 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5515MB free_disk=72.91897201538086GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:34:21 compute-0 nova_compute[186662]: 2026-02-19 19:34:21.513 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:34:21 compute-0 nova_compute[186662]: 2026-02-19 19:34:21.513 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:34:22 compute-0 nova_compute[186662]: 2026-02-19 19:34:22.532 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Applying migration context for instance 975e14fd-3f0a-458d-81e0-ff0cd2344d14 as it has an incoming, in-progress migration 7eb7ca43-0f56-4003-8243-eaa028ab8a91. Migration status is running _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1046
Feb 19 19:34:22 compute-0 nova_compute[186662]: 2026-02-19 19:34:22.532 186666 DEBUG nova.objects.instance [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Feb 19 19:34:23 compute-0 nova_compute[186662]: 2026-02-19 19:34:23.040 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Feb 19 19:34:23 compute-0 nova_compute[186662]: 2026-02-19 19:34:23.067 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance f844ee8e-6a84-4d80-aa1d-b8f5230eccce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:34:23 compute-0 nova_compute[186662]: 2026-02-19 19:34:23.068 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance 975e14fd-3f0a-458d-81e0-ff0cd2344d14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:34:23 compute-0 nova_compute[186662]: 2026-02-19 19:34:23.068 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:34:23 compute-0 nova_compute[186662]: 2026-02-19 19:34:23.069 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:34:21 up  1:05,  0 user,  load average: 0.58, 0.38, 0.37\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_87043e904d374d2fbc50a010c14c8987': '2', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:34:23 compute-0 nova_compute[186662]: 2026-02-19 19:34:23.086 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing inventories for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Feb 19 19:34:23 compute-0 nova_compute[186662]: 2026-02-19 19:34:23.102 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating ProviderTree inventory for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Feb 19 19:34:23 compute-0 nova_compute[186662]: 2026-02-19 19:34:23.102 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating inventory in ProviderTree for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Feb 19 19:34:23 compute-0 nova_compute[186662]: 2026-02-19 19:34:23.115 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing aggregate associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Feb 19 19:34:23 compute-0 nova_compute[186662]: 2026-02-19 19:34:23.133 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing trait associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_SSE2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,HW_ARCH_X86_64,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_USB _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Feb 19 19:34:23 compute-0 nova_compute[186662]: 2026-02-19 19:34:23.179 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:23 compute-0 nova_compute[186662]: 2026-02-19 19:34:23.210 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:34:23 compute-0 nova_compute[186662]: 2026-02-19 19:34:23.720 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:34:24 compute-0 nova_compute[186662]: 2026-02-19 19:34:24.231 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:34:24 compute-0 nova_compute[186662]: 2026-02-19 19:34:24.232 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.718s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:34:25 compute-0 nova_compute[186662]: 2026-02-19 19:34:25.099 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:28 compute-0 nova_compute[186662]: 2026-02-19 19:34:28.178 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:28 compute-0 nova_compute[186662]: 2026-02-19 19:34:28.612 186666 DEBUG oslo_concurrency.lockutils [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:34:28 compute-0 nova_compute[186662]: 2026-02-19 19:34:28.613 186666 DEBUG oslo_concurrency.lockutils [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:34:28 compute-0 nova_compute[186662]: 2026-02-19 19:34:28.613 186666 DEBUG oslo_concurrency.lockutils [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:34:28 compute-0 nova_compute[186662]: 2026-02-19 19:34:28.614 186666 DEBUG oslo_concurrency.lockutils [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:34:28 compute-0 nova_compute[186662]: 2026-02-19 19:34:28.614 186666 DEBUG oslo_concurrency.lockutils [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:34:28 compute-0 nova_compute[186662]: 2026-02-19 19:34:28.625 186666 INFO nova.compute.manager [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Terminating instance
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.138 186666 DEBUG nova.compute.manager [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:34:29 compute-0 kernel: tap09d9f919-c2 (unregistering): left promiscuous mode
Feb 19 19:34:29 compute-0 NetworkManager[56519]: <info>  [1771529669.1790] device (tap09d9f919-c2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:34:29 compute-0 ovn_controller[96653]: 2026-02-19T19:34:29Z|00116|binding|INFO|Releasing lport 09d9f919-c2ab-4c91-896e-88de5d7337d7 from this chassis (sb_readonly=0)
Feb 19 19:34:29 compute-0 ovn_controller[96653]: 2026-02-19T19:34:29Z|00117|binding|INFO|Setting lport 09d9f919-c2ab-4c91-896e-88de5d7337d7 down in Southbound
Feb 19 19:34:29 compute-0 ovn_controller[96653]: 2026-02-19T19:34:29Z|00118|binding|INFO|Removing iface tap09d9f919-c2 ovn-installed in OVS
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.219 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.221 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.228 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:29 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:29.229 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:28:a5 10.100.0.3'], port_security=['fa:16:3e:ba:28:a5 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'f844ee8e-6a84-4d80-aa1d-b8f5230eccce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '87043e904d374d2fbc50a010c14c8987', 'neutron:revision_number': '5', 'neutron:security_group_ids': '624104e9-162a-4121-9113-49a4c95099e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82a17ccb-74c9-404e-969f-7001a470c860, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=09d9f919-c2ab-4c91-896e-88de5d7337d7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:34:29 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:29.230 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 09d9f919-c2ab-4c91-896e-88de5d7337d7 in datapath f37b00da-2392-46ae-ac87-2c54ab8961a2 unbound from our chassis
Feb 19 19:34:29 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:29.232 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f37b00da-2392-46ae-ac87-2c54ab8961a2
Feb 19 19:34:29 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:29.245 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[58958054-0b22-4955-afac-146ff22c54b7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:29 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Feb 19 19:34:29 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000d.scope: Consumed 13.631s CPU time.
Feb 19 19:34:29 compute-0 systemd-machined[156014]: Machine qemu-9-instance-0000000d terminated.
Feb 19 19:34:29 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:29.266 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[7ece1c1a-81d3-46ef-bc4c-5948f14aa65d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:29 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:29.268 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[486d55f7-e19d-4712-8fd2-1ab305ea49b1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:29 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:29.287 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[331c8d98-59c6-4b24-84f7-ebc8d1eefd24]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:29 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:29.301 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[0938e8ce-b412-4031-8562-e52e04f72c27]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf37b00da-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:25:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388472, 'reachable_time': 34062, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212197, 'error': None, 'target': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:29 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:29.310 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[df66b229-84ef-4e9f-8fea-d4bce789f3a0]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf37b00da-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388480, 'tstamp': 388480}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212198, 'error': None, 'target': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf37b00da-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388482, 'tstamp': 388482}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212198, 'error': None, 'target': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:29 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:29.312 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf37b00da-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.313 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.316 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:29 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:29.317 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf37b00da-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:34:29 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:29.317 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:34:29 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:29.318 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf37b00da-20, col_values=(('external_ids', {'iface-id': 'e682361b-89b8-4c67-9593-6b6d57e3096a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:34:29 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:29.318 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:34:29 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:29.319 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[cd69e86f-8dbb-4212-9607-11a0cd1fb4ba]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f37b00da-2392-46ae-ac87-2c54ab8961a2\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f37b00da-2392-46ae-ac87-2c54ab8961a2\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.349 186666 DEBUG nova.compute.manager [req-116f5aa7-fd5a-4ea6-a2fa-4a646dfd6d96 req-fdc8dba5-d0b3-4c96-ab5c-bee98eead9c4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Received event network-vif-unplugged-09d9f919-c2ab-4c91-896e-88de5d7337d7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.349 186666 DEBUG oslo_concurrency.lockutils [req-116f5aa7-fd5a-4ea6-a2fa-4a646dfd6d96 req-fdc8dba5-d0b3-4c96-ab5c-bee98eead9c4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.349 186666 DEBUG oslo_concurrency.lockutils [req-116f5aa7-fd5a-4ea6-a2fa-4a646dfd6d96 req-fdc8dba5-d0b3-4c96-ab5c-bee98eead9c4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.349 186666 DEBUG oslo_concurrency.lockutils [req-116f5aa7-fd5a-4ea6-a2fa-4a646dfd6d96 req-fdc8dba5-d0b3-4c96-ab5c-bee98eead9c4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.349 186666 DEBUG nova.compute.manager [req-116f5aa7-fd5a-4ea6-a2fa-4a646dfd6d96 req-fdc8dba5-d0b3-4c96-ab5c-bee98eead9c4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] No waiting events found dispatching network-vif-unplugged-09d9f919-c2ab-4c91-896e-88de5d7337d7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.350 186666 DEBUG nova.compute.manager [req-116f5aa7-fd5a-4ea6-a2fa-4a646dfd6d96 req-fdc8dba5-d0b3-4c96-ab5c-bee98eead9c4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Received event network-vif-unplugged-09d9f919-c2ab-4c91-896e-88de5d7337d7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.388 186666 INFO nova.virt.libvirt.driver [-] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Instance destroyed successfully.
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.389 186666 DEBUG nova.objects.instance [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lazy-loading 'resources' on Instance uuid f844ee8e-6a84-4d80-aa1d-b8f5230eccce obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:34:29 compute-0 podman[196025]: time="2026-02-19T19:34:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:34:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:34:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17221 "" "Go-http-client/1.1"
Feb 19 19:34:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:34:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2660 "" "Go-http-client/1.1"
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.894 186666 DEBUG nova.virt.libvirt.vif [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:33:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1003432610',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1003432610',id=13,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:33:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='87043e904d374d2fbc50a010c14c8987',ramdisk_id='',reservation_id='r-lqs1x8kd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-488758453',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-488758453-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:33:36Z,user_data=None,user_id='f74ba8e1becb4d8f83bb148785aac310',uuid=f844ee8e-6a84-4d80-aa1d-b8f5230eccce,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "09d9f919-c2ab-4c91-896e-88de5d7337d7", "address": "fa:16:3e:ba:28:a5", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09d9f919-c2", "ovs_interfaceid": "09d9f919-c2ab-4c91-896e-88de5d7337d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.895 186666 DEBUG nova.network.os_vif_util [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Converting VIF {"id": "09d9f919-c2ab-4c91-896e-88de5d7337d7", "address": "fa:16:3e:ba:28:a5", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09d9f919-c2", "ovs_interfaceid": "09d9f919-c2ab-4c91-896e-88de5d7337d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.895 186666 DEBUG nova.network.os_vif_util [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:28:a5,bridge_name='br-int',has_traffic_filtering=True,id=09d9f919-c2ab-4c91-896e-88de5d7337d7,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09d9f919-c2') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.896 186666 DEBUG os_vif [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:28:a5,bridge_name='br-int',has_traffic_filtering=True,id=09d9f919-c2ab-4c91-896e-88de5d7337d7,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09d9f919-c2') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.897 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.897 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09d9f919-c2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.899 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.900 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.901 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.901 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=fd941937-5032-432b-af32-dd7694ab1a4a) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.902 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.903 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.905 186666 INFO os_vif [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:28:a5,bridge_name='br-int',has_traffic_filtering=True,id=09d9f919-c2ab-4c91-896e-88de5d7337d7,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09d9f919-c2')
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.906 186666 INFO nova.virt.libvirt.driver [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Deleting instance files /var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce_del
Feb 19 19:34:29 compute-0 nova_compute[186662]: 2026-02-19 19:34:29.906 186666 INFO nova.virt.libvirt.driver [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Deletion of /var/lib/nova/instances/f844ee8e-6a84-4d80-aa1d-b8f5230eccce_del complete
Feb 19 19:34:30 compute-0 nova_compute[186662]: 2026-02-19 19:34:30.417 186666 INFO nova.compute.manager [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Took 1.28 seconds to destroy the instance on the hypervisor.
Feb 19 19:34:30 compute-0 nova_compute[186662]: 2026-02-19 19:34:30.418 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:34:30 compute-0 nova_compute[186662]: 2026-02-19 19:34:30.418 186666 DEBUG nova.compute.manager [-] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:34:30 compute-0 nova_compute[186662]: 2026-02-19 19:34:30.418 186666 DEBUG nova.network.neutron [-] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:34:30 compute-0 nova_compute[186662]: 2026-02-19 19:34:30.418 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:34:30 compute-0 nova_compute[186662]: 2026-02-19 19:34:30.678 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:34:31 compute-0 openstack_network_exporter[198916]: ERROR   19:34:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:34:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:34:31 compute-0 openstack_network_exporter[198916]: ERROR   19:34:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:34:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:34:31 compute-0 nova_compute[186662]: 2026-02-19 19:34:31.427 186666 DEBUG nova.network.neutron [-] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:34:31 compute-0 nova_compute[186662]: 2026-02-19 19:34:31.431 186666 DEBUG nova.compute.manager [req-e9655fd9-4761-4dd9-84b3-26128f5901da req-967f422c-866f-4796-9a76-3de429258df0 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Received event network-vif-unplugged-09d9f919-c2ab-4c91-896e-88de5d7337d7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:34:31 compute-0 nova_compute[186662]: 2026-02-19 19:34:31.431 186666 DEBUG oslo_concurrency.lockutils [req-e9655fd9-4761-4dd9-84b3-26128f5901da req-967f422c-866f-4796-9a76-3de429258df0 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:34:31 compute-0 nova_compute[186662]: 2026-02-19 19:34:31.431 186666 DEBUG oslo_concurrency.lockutils [req-e9655fd9-4761-4dd9-84b3-26128f5901da req-967f422c-866f-4796-9a76-3de429258df0 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:34:31 compute-0 nova_compute[186662]: 2026-02-19 19:34:31.431 186666 DEBUG oslo_concurrency.lockutils [req-e9655fd9-4761-4dd9-84b3-26128f5901da req-967f422c-866f-4796-9a76-3de429258df0 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:34:31 compute-0 nova_compute[186662]: 2026-02-19 19:34:31.432 186666 DEBUG nova.compute.manager [req-e9655fd9-4761-4dd9-84b3-26128f5901da req-967f422c-866f-4796-9a76-3de429258df0 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] No waiting events found dispatching network-vif-unplugged-09d9f919-c2ab-4c91-896e-88de5d7337d7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:34:31 compute-0 nova_compute[186662]: 2026-02-19 19:34:31.432 186666 DEBUG nova.compute.manager [req-e9655fd9-4761-4dd9-84b3-26128f5901da req-967f422c-866f-4796-9a76-3de429258df0 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Received event network-vif-unplugged-09d9f919-c2ab-4c91-896e-88de5d7337d7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:34:31 compute-0 nova_compute[186662]: 2026-02-19 19:34:31.432 186666 DEBUG nova.compute.manager [req-e9655fd9-4761-4dd9-84b3-26128f5901da req-967f422c-866f-4796-9a76-3de429258df0 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Received event network-vif-deleted-09d9f919-c2ab-4c91-896e-88de5d7337d7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:34:31 compute-0 nova_compute[186662]: 2026-02-19 19:34:31.432 186666 INFO nova.compute.manager [req-e9655fd9-4761-4dd9-84b3-26128f5901da req-967f422c-866f-4796-9a76-3de429258df0 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Neutron deleted interface 09d9f919-c2ab-4c91-896e-88de5d7337d7; detaching it from the instance and deleting it from the info cache
Feb 19 19:34:31 compute-0 nova_compute[186662]: 2026-02-19 19:34:31.433 186666 DEBUG nova.network.neutron [req-e9655fd9-4761-4dd9-84b3-26128f5901da req-967f422c-866f-4796-9a76-3de429258df0 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:34:31 compute-0 nova_compute[186662]: 2026-02-19 19:34:31.937 186666 INFO nova.compute.manager [-] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Took 1.52 seconds to deallocate network for instance.
Feb 19 19:34:31 compute-0 nova_compute[186662]: 2026-02-19 19:34:31.941 186666 DEBUG nova.compute.manager [req-e9655fd9-4761-4dd9-84b3-26128f5901da req-967f422c-866f-4796-9a76-3de429258df0 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: f844ee8e-6a84-4d80-aa1d-b8f5230eccce] Detach interface failed, port_id=09d9f919-c2ab-4c91-896e-88de5d7337d7, reason: Instance f844ee8e-6a84-4d80-aa1d-b8f5230eccce could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Feb 19 19:34:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:32.132 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:34:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:32.133 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:34:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:32.133 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:34:32 compute-0 nova_compute[186662]: 2026-02-19 19:34:32.461 186666 DEBUG oslo_concurrency.lockutils [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:34:32 compute-0 nova_compute[186662]: 2026-02-19 19:34:32.462 186666 DEBUG oslo_concurrency.lockutils [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:34:32 compute-0 nova_compute[186662]: 2026-02-19 19:34:32.529 186666 DEBUG nova.compute.provider_tree [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:34:33 compute-0 nova_compute[186662]: 2026-02-19 19:34:33.036 186666 DEBUG nova.scheduler.client.report [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:34:33 compute-0 nova_compute[186662]: 2026-02-19 19:34:33.231 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:33 compute-0 nova_compute[186662]: 2026-02-19 19:34:33.545 186666 DEBUG oslo_concurrency.lockutils [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.083s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:34:33 compute-0 nova_compute[186662]: 2026-02-19 19:34:33.567 186666 INFO nova.scheduler.client.report [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Deleted allocations for instance f844ee8e-6a84-4d80-aa1d-b8f5230eccce
Feb 19 19:34:34 compute-0 nova_compute[186662]: 2026-02-19 19:34:34.598 186666 DEBUG oslo_concurrency.lockutils [None req-54ec58ef-0555-4f3c-bebd-c9b29d47e161 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "f844ee8e-6a84-4d80-aa1d-b8f5230eccce" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.985s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:34:34 compute-0 nova_compute[186662]: 2026-02-19 19:34:34.902 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:36 compute-0 nova_compute[186662]: 2026-02-19 19:34:36.460 186666 DEBUG oslo_concurrency.lockutils [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "975e14fd-3f0a-458d-81e0-ff0cd2344d14" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:34:36 compute-0 nova_compute[186662]: 2026-02-19 19:34:36.460 186666 DEBUG oslo_concurrency.lockutils [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "975e14fd-3f0a-458d-81e0-ff0cd2344d14" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:34:36 compute-0 nova_compute[186662]: 2026-02-19 19:34:36.461 186666 DEBUG oslo_concurrency.lockutils [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "975e14fd-3f0a-458d-81e0-ff0cd2344d14-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:34:36 compute-0 nova_compute[186662]: 2026-02-19 19:34:36.461 186666 DEBUG oslo_concurrency.lockutils [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "975e14fd-3f0a-458d-81e0-ff0cd2344d14-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:34:36 compute-0 nova_compute[186662]: 2026-02-19 19:34:36.461 186666 DEBUG oslo_concurrency.lockutils [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "975e14fd-3f0a-458d-81e0-ff0cd2344d14-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:34:36 compute-0 nova_compute[186662]: 2026-02-19 19:34:36.471 186666 INFO nova.compute.manager [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Terminating instance
Feb 19 19:34:36 compute-0 nova_compute[186662]: 2026-02-19 19:34:36.986 186666 DEBUG nova.compute.manager [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:34:37 compute-0 kernel: tap5248d484-29 (unregistering): left promiscuous mode
Feb 19 19:34:37 compute-0 NetworkManager[56519]: <info>  [1771529677.0131] device (tap5248d484-29): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:34:37 compute-0 ovn_controller[96653]: 2026-02-19T19:34:37Z|00119|binding|INFO|Releasing lport 5248d484-2946-4983-9716-18e4eed7c94c from this chassis (sb_readonly=0)
Feb 19 19:34:37 compute-0 ovn_controller[96653]: 2026-02-19T19:34:37Z|00120|binding|INFO|Setting lport 5248d484-2946-4983-9716-18e4eed7c94c down in Southbound
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.018 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:37 compute-0 ovn_controller[96653]: 2026-02-19T19:34:37Z|00121|binding|INFO|Removing iface tap5248d484-29 ovn-installed in OVS
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.020 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:37.027 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:b0:ef 10.100.0.9'], port_security=['fa:16:3e:1f:b0:ef 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '975e14fd-3f0a-458d-81e0-ff0cd2344d14', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '87043e904d374d2fbc50a010c14c8987', 'neutron:revision_number': '14', 'neutron:security_group_ids': '624104e9-162a-4121-9113-49a4c95099e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82a17ccb-74c9-404e-969f-7001a470c860, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=5248d484-2946-4983-9716-18e4eed7c94c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:34:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:37.028 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 5248d484-2946-4983-9716-18e4eed7c94c in datapath f37b00da-2392-46ae-ac87-2c54ab8961a2 unbound from our chassis
Feb 19 19:34:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:37.029 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f37b00da-2392-46ae-ac87-2c54ab8961a2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:34:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:37.029 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[f980ea5f-0578-482c-8a21-b313cb770c88]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:37.030 105986 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2 namespace which is not needed anymore
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.034 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:37 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Feb 19 19:34:37 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000c.scope: Consumed 2.581s CPU time.
Feb 19 19:34:37 compute-0 systemd-machined[156014]: Machine qemu-10-instance-0000000c terminated.
Feb 19 19:34:37 compute-0 neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2[211851]: [NOTICE]   (211855) : haproxy version is 3.0.5-8e879a5
Feb 19 19:34:37 compute-0 neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2[211851]: [NOTICE]   (211855) : path to executable is /usr/sbin/haproxy
Feb 19 19:34:37 compute-0 neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2[211851]: [WARNING]  (211855) : Exiting Master process...
Feb 19 19:34:37 compute-0 podman[212244]: 2026-02-19 19:34:37.134353704 +0000 UTC m=+0.030212002 container kill 075df9ee4106fa07b920228f993bf4f48d8e5dd23e0526b5b02d2e4b563fe3be (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0)
Feb 19 19:34:37 compute-0 neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2[211851]: [ALERT]    (211855) : Current worker (211857) exited with code 143 (Terminated)
Feb 19 19:34:37 compute-0 neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2[211851]: [WARNING]  (211855) : All workers exited. Exiting... (0)
Feb 19 19:34:37 compute-0 systemd[1]: libpod-075df9ee4106fa07b920228f993bf4f48d8e5dd23e0526b5b02d2e4b563fe3be.scope: Deactivated successfully.
Feb 19 19:34:37 compute-0 podman[212262]: 2026-02-19 19:34:37.180790769 +0000 UTC m=+0.021619520 container died 075df9ee4106fa07b920228f993bf4f48d8e5dd23e0526b5b02d2e4b563fe3be (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Feb 19 19:34:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-075df9ee4106fa07b920228f993bf4f48d8e5dd23e0526b5b02d2e4b563fe3be-userdata-shm.mount: Deactivated successfully.
Feb 19 19:34:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-12bed4a781ee87158df8114fc420396a9d9978c0b4881e96d2a76877fe1f11cf-merged.mount: Deactivated successfully.
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.205 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.209 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:37 compute-0 podman[212262]: 2026-02-19 19:34:37.223287909 +0000 UTC m=+0.064116660 container remove 075df9ee4106fa07b920228f993bf4f48d8e5dd23e0526b5b02d2e4b563fe3be (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 19 19:34:37 compute-0 systemd[1]: libpod-conmon-075df9ee4106fa07b920228f993bf4f48d8e5dd23e0526b5b02d2e4b563fe3be.scope: Deactivated successfully.
Feb 19 19:34:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:37.228 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[9865597a-a422-4c16-b4be-e5107ed56a60]: (4, ("Thu Feb 19 07:34:37 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2 (075df9ee4106fa07b920228f993bf4f48d8e5dd23e0526b5b02d2e4b563fe3be)\n075df9ee4106fa07b920228f993bf4f48d8e5dd23e0526b5b02d2e4b563fe3be\nThu Feb 19 07:34:37 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2 (075df9ee4106fa07b920228f993bf4f48d8e5dd23e0526b5b02d2e4b563fe3be)\n075df9ee4106fa07b920228f993bf4f48d8e5dd23e0526b5b02d2e4b563fe3be\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:37.231 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[75e464cf-92a8-4e3c-8e14-178fa0413b00]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:37.231 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:34:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:37.232 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[90d9712c-f0c9-488a-8f90-783af1986d20]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:37.232 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf37b00da-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.234 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:37 compute-0 kernel: tapf37b00da-20: left promiscuous mode
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.244 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.246 186666 INFO nova.virt.libvirt.driver [-] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Instance destroyed successfully.
Feb 19 19:34:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:37.247 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[58b7958b-ce15-49cb-a035-7a3ed576c133]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.247 186666 DEBUG nova.objects.instance [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lazy-loading 'resources' on Instance uuid 975e14fd-3f0a-458d-81e0-ff0cd2344d14 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:34:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:37.258 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[6d37e6b1-c848-4513-bff6-b1a2f33f9c37]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:37.259 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[14765885-8307-4194-b972-2962fe0972ee]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:37.269 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[dee149a1-6186-468f-9cd0-c085ec42dfc2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388467, 'reachable_time': 18121, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212310, 'error': None, 'target': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:37.271 106358 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Feb 19 19:34:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:34:37.271 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[3406160d-8c90-4acd-9078-472207031871]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:34:37 compute-0 systemd[1]: run-netns-ovnmeta\x2df37b00da\x2d2392\x2d46ae\x2dac87\x2d2c54ab8961a2.mount: Deactivated successfully.
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.755 186666 DEBUG nova.virt.libvirt.vif [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-02-19T19:33:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1847271986',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1847271986',id=12,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:33:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='87043e904d374d2fbc50a010c14c8987',ramdisk_id='',reservation_id='r-fuu10k82',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-488758453',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-488758453-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:34:16Z,user_data=None,user_id='f74ba8e1becb4d8f83bb148785aac310',uuid=975e14fd-3f0a-458d-81e0-ff0cd2344d14,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5248d484-2946-4983-9716-18e4eed7c94c", "address": "fa:16:3e:1f:b0:ef", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5248d484-29", "ovs_interfaceid": "5248d484-2946-4983-9716-18e4eed7c94c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.756 186666 DEBUG nova.network.os_vif_util [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Converting VIF {"id": "5248d484-2946-4983-9716-18e4eed7c94c", "address": "fa:16:3e:1f:b0:ef", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5248d484-29", "ovs_interfaceid": "5248d484-2946-4983-9716-18e4eed7c94c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.757 186666 DEBUG nova.network.os_vif_util [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1f:b0:ef,bridge_name='br-int',has_traffic_filtering=True,id=5248d484-2946-4983-9716-18e4eed7c94c,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5248d484-29') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.758 186666 DEBUG os_vif [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:b0:ef,bridge_name='br-int',has_traffic_filtering=True,id=5248d484-2946-4983-9716-18e4eed7c94c,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5248d484-29') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.759 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.760 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5248d484-29, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.762 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.763 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.764 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.765 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=637dcbc4-bb68-47d8-b8fa-59bcffc3bb80) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.766 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.767 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.769 186666 INFO os_vif [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:b0:ef,bridge_name='br-int',has_traffic_filtering=True,id=5248d484-2946-4983-9716-18e4eed7c94c,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5248d484-29')
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.770 186666 INFO nova.virt.libvirt.driver [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Deleting instance files /var/lib/nova/instances/975e14fd-3f0a-458d-81e0-ff0cd2344d14_del
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.771 186666 INFO nova.virt.libvirt.driver [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Deletion of /var/lib/nova/instances/975e14fd-3f0a-458d-81e0-ff0cd2344d14_del complete
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.780 186666 DEBUG nova.compute.manager [req-2656a020-26a9-41aa-8a29-1a662d07beca req-58e9e2b1-7f1c-4e01-843a-fd47919c930e 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Received event network-vif-unplugged-5248d484-2946-4983-9716-18e4eed7c94c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.781 186666 DEBUG oslo_concurrency.lockutils [req-2656a020-26a9-41aa-8a29-1a662d07beca req-58e9e2b1-7f1c-4e01-843a-fd47919c930e 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "975e14fd-3f0a-458d-81e0-ff0cd2344d14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.781 186666 DEBUG oslo_concurrency.lockutils [req-2656a020-26a9-41aa-8a29-1a662d07beca req-58e9e2b1-7f1c-4e01-843a-fd47919c930e 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "975e14fd-3f0a-458d-81e0-ff0cd2344d14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.781 186666 DEBUG oslo_concurrency.lockutils [req-2656a020-26a9-41aa-8a29-1a662d07beca req-58e9e2b1-7f1c-4e01-843a-fd47919c930e 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "975e14fd-3f0a-458d-81e0-ff0cd2344d14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.782 186666 DEBUG nova.compute.manager [req-2656a020-26a9-41aa-8a29-1a662d07beca req-58e9e2b1-7f1c-4e01-843a-fd47919c930e 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] No waiting events found dispatching network-vif-unplugged-5248d484-2946-4983-9716-18e4eed7c94c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:34:37 compute-0 nova_compute[186662]: 2026-02-19 19:34:37.782 186666 DEBUG nova.compute.manager [req-2656a020-26a9-41aa-8a29-1a662d07beca req-58e9e2b1-7f1c-4e01-843a-fd47919c930e 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Received event network-vif-unplugged-5248d484-2946-4983-9716-18e4eed7c94c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:34:38 compute-0 nova_compute[186662]: 2026-02-19 19:34:38.287 186666 INFO nova.compute.manager [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Took 1.30 seconds to destroy the instance on the hypervisor.
Feb 19 19:34:38 compute-0 nova_compute[186662]: 2026-02-19 19:34:38.288 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:34:38 compute-0 nova_compute[186662]: 2026-02-19 19:34:38.288 186666 DEBUG nova.compute.manager [-] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:34:38 compute-0 nova_compute[186662]: 2026-02-19 19:34:38.288 186666 DEBUG nova.network.neutron [-] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:34:38 compute-0 nova_compute[186662]: 2026-02-19 19:34:38.288 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:34:38 compute-0 nova_compute[186662]: 2026-02-19 19:34:38.291 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:38 compute-0 nova_compute[186662]: 2026-02-19 19:34:38.716 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:34:39 compute-0 nova_compute[186662]: 2026-02-19 19:34:39.047 186666 DEBUG nova.compute.manager [req-2590bfff-2bb3-422f-8152-972a26961abf req-1f483187-d815-455f-91d3-8e64cd8c3c4e 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Received event network-vif-deleted-5248d484-2946-4983-9716-18e4eed7c94c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:34:39 compute-0 nova_compute[186662]: 2026-02-19 19:34:39.047 186666 INFO nova.compute.manager [req-2590bfff-2bb3-422f-8152-972a26961abf req-1f483187-d815-455f-91d3-8e64cd8c3c4e 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Neutron deleted interface 5248d484-2946-4983-9716-18e4eed7c94c; detaching it from the instance and deleting it from the info cache
Feb 19 19:34:39 compute-0 nova_compute[186662]: 2026-02-19 19:34:39.047 186666 DEBUG nova.network.neutron [req-2590bfff-2bb3-422f-8152-972a26961abf req-1f483187-d815-455f-91d3-8e64cd8c3c4e 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:34:39 compute-0 podman[212311]: 2026-02-19 19:34:39.284829387 +0000 UTC m=+0.061744536 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Feb 19 19:34:39 compute-0 nova_compute[186662]: 2026-02-19 19:34:39.494 186666 DEBUG nova.network.neutron [-] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:34:39 compute-0 nova_compute[186662]: 2026-02-19 19:34:39.555 186666 DEBUG nova.compute.manager [req-2590bfff-2bb3-422f-8152-972a26961abf req-1f483187-d815-455f-91d3-8e64cd8c3c4e 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Detach interface failed, port_id=5248d484-2946-4983-9716-18e4eed7c94c, reason: Instance 975e14fd-3f0a-458d-81e0-ff0cd2344d14 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Feb 19 19:34:39 compute-0 nova_compute[186662]: 2026-02-19 19:34:39.851 186666 DEBUG nova.compute.manager [req-4ba63f64-7997-4693-ad4a-e6dcf1f0cf37 req-6c588347-2974-4deb-aa50-8ec667773f95 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Received event network-vif-unplugged-5248d484-2946-4983-9716-18e4eed7c94c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:34:39 compute-0 nova_compute[186662]: 2026-02-19 19:34:39.852 186666 DEBUG oslo_concurrency.lockutils [req-4ba63f64-7997-4693-ad4a-e6dcf1f0cf37 req-6c588347-2974-4deb-aa50-8ec667773f95 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "975e14fd-3f0a-458d-81e0-ff0cd2344d14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:34:39 compute-0 nova_compute[186662]: 2026-02-19 19:34:39.854 186666 DEBUG oslo_concurrency.lockutils [req-4ba63f64-7997-4693-ad4a-e6dcf1f0cf37 req-6c588347-2974-4deb-aa50-8ec667773f95 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "975e14fd-3f0a-458d-81e0-ff0cd2344d14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:34:39 compute-0 nova_compute[186662]: 2026-02-19 19:34:39.855 186666 DEBUG oslo_concurrency.lockutils [req-4ba63f64-7997-4693-ad4a-e6dcf1f0cf37 req-6c588347-2974-4deb-aa50-8ec667773f95 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "975e14fd-3f0a-458d-81e0-ff0cd2344d14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:34:39 compute-0 nova_compute[186662]: 2026-02-19 19:34:39.855 186666 DEBUG nova.compute.manager [req-4ba63f64-7997-4693-ad4a-e6dcf1f0cf37 req-6c588347-2974-4deb-aa50-8ec667773f95 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] No waiting events found dispatching network-vif-unplugged-5248d484-2946-4983-9716-18e4eed7c94c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:34:39 compute-0 nova_compute[186662]: 2026-02-19 19:34:39.856 186666 DEBUG nova.compute.manager [req-4ba63f64-7997-4693-ad4a-e6dcf1f0cf37 req-6c588347-2974-4deb-aa50-8ec667773f95 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Received event network-vif-unplugged-5248d484-2946-4983-9716-18e4eed7c94c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:34:40 compute-0 nova_compute[186662]: 2026-02-19 19:34:40.001 186666 INFO nova.compute.manager [-] [instance: 975e14fd-3f0a-458d-81e0-ff0cd2344d14] Took 1.71 seconds to deallocate network for instance.
Feb 19 19:34:40 compute-0 nova_compute[186662]: 2026-02-19 19:34:40.524 186666 DEBUG oslo_concurrency.lockutils [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:34:40 compute-0 nova_compute[186662]: 2026-02-19 19:34:40.525 186666 DEBUG oslo_concurrency.lockutils [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:34:40 compute-0 nova_compute[186662]: 2026-02-19 19:34:40.579 186666 DEBUG nova.compute.provider_tree [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:34:41 compute-0 nova_compute[186662]: 2026-02-19 19:34:41.087 186666 DEBUG nova.scheduler.client.report [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:34:41 compute-0 nova_compute[186662]: 2026-02-19 19:34:41.598 186666 DEBUG oslo_concurrency.lockutils [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.072s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:34:41 compute-0 nova_compute[186662]: 2026-02-19 19:34:41.625 186666 INFO nova.scheduler.client.report [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Deleted allocations for instance 975e14fd-3f0a-458d-81e0-ff0cd2344d14
Feb 19 19:34:42 compute-0 nova_compute[186662]: 2026-02-19 19:34:42.694 186666 DEBUG oslo_concurrency.lockutils [None req-236bd366-51ed-4b8c-bbfb-8680cc9aee7d f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "975e14fd-3f0a-458d-81e0-ff0cd2344d14" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.234s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:34:42 compute-0 nova_compute[186662]: 2026-02-19 19:34:42.767 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:43 compute-0 sshd-session[212333]: Invalid user admin from 96.78.175.42 port 35194
Feb 19 19:34:43 compute-0 podman[212335]: 2026-02-19 19:34:43.243475385 +0000 UTC m=+0.067582653 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-type=git, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 19 19:34:43 compute-0 sshd-session[212333]: Received disconnect from 96.78.175.42 port 35194:11: Bye Bye [preauth]
Feb 19 19:34:43 compute-0 sshd-session[212333]: Disconnected from invalid user admin 96.78.175.42 port 35194 [preauth]
Feb 19 19:34:43 compute-0 nova_compute[186662]: 2026-02-19 19:34:43.335 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:45 compute-0 podman[212357]: 2026-02-19 19:34:45.316877132 +0000 UTC m=+0.085788992 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:34:47 compute-0 nova_compute[186662]: 2026-02-19 19:34:47.768 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:48 compute-0 nova_compute[186662]: 2026-02-19 19:34:48.372 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:50 compute-0 podman[212383]: 2026-02-19 19:34:50.292771076 +0000 UTC m=+0.067733837 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 19 19:34:51 compute-0 sshd-session[212410]: Received disconnect from 91.224.92.108 port 43616:11:  [preauth]
Feb 19 19:34:51 compute-0 sshd-session[212410]: Disconnected from authenticating user root 91.224.92.108 port 43616 [preauth]
Feb 19 19:34:52 compute-0 nova_compute[186662]: 2026-02-19 19:34:52.770 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:53 compute-0 nova_compute[186662]: 2026-02-19 19:34:53.395 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:57 compute-0 nova_compute[186662]: 2026-02-19 19:34:57.772 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:58 compute-0 nova_compute[186662]: 2026-02-19 19:34:58.449 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:34:59 compute-0 podman[196025]: time="2026-02-19T19:34:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:34:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:34:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:34:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:34:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2193 "" "Go-http-client/1.1"
Feb 19 19:35:01 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:01.109 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:35:01 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:01.109 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:35:01 compute-0 nova_compute[186662]: 2026-02-19 19:35:01.109 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:01 compute-0 openstack_network_exporter[198916]: ERROR   19:35:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:35:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:35:01 compute-0 openstack_network_exporter[198916]: ERROR   19:35:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:35:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:35:02 compute-0 nova_compute[186662]: 2026-02-19 19:35:02.774 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:03 compute-0 nova_compute[186662]: 2026-02-19 19:35:03.450 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:04 compute-0 sshd-session[212413]: Invalid user n8n from 182.75.216.74 port 62617
Feb 19 19:35:04 compute-0 sshd-session[212413]: Received disconnect from 182.75.216.74 port 62617:11: Bye Bye [preauth]
Feb 19 19:35:04 compute-0 sshd-session[212413]: Disconnected from invalid user n8n 182.75.216.74 port 62617 [preauth]
Feb 19 19:35:06 compute-0 nova_compute[186662]: 2026-02-19 19:35:06.664 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:35:06 compute-0 nova_compute[186662]: 2026-02-19 19:35:06.664 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:35:07 compute-0 nova_compute[186662]: 2026-02-19 19:35:07.170 186666 DEBUG nova.compute.manager [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Feb 19 19:35:07 compute-0 nova_compute[186662]: 2026-02-19 19:35:07.712 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:35:07 compute-0 nova_compute[186662]: 2026-02-19 19:35:07.712 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:35:07 compute-0 nova_compute[186662]: 2026-02-19 19:35:07.717 186666 DEBUG nova.virt.hardware [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Feb 19 19:35:07 compute-0 nova_compute[186662]: 2026-02-19 19:35:07.718 186666 INFO nova.compute.claims [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Claim successful on node compute-0.ctlplane.example.com
Feb 19 19:35:07 compute-0 nova_compute[186662]: 2026-02-19 19:35:07.775 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:08 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:08.111 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:35:08 compute-0 nova_compute[186662]: 2026-02-19 19:35:08.490 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:08 compute-0 nova_compute[186662]: 2026-02-19 19:35:08.764 186666 DEBUG nova.compute.provider_tree [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:35:09 compute-0 nova_compute[186662]: 2026-02-19 19:35:09.272 186666 DEBUG nova.scheduler.client.report [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:35:09 compute-0 nova_compute[186662]: 2026-02-19 19:35:09.781 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.068s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:35:09 compute-0 nova_compute[186662]: 2026-02-19 19:35:09.782 186666 DEBUG nova.compute.manager [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Feb 19 19:35:10 compute-0 podman[212415]: 2026-02-19 19:35:10.279871813 +0000 UTC m=+0.055405337 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:35:10 compute-0 nova_compute[186662]: 2026-02-19 19:35:10.294 186666 DEBUG nova.compute.manager [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Feb 19 19:35:10 compute-0 nova_compute[186662]: 2026-02-19 19:35:10.295 186666 DEBUG nova.network.neutron [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Feb 19 19:35:10 compute-0 nova_compute[186662]: 2026-02-19 19:35:10.295 186666 WARNING neutronclient.v2_0.client [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:35:10 compute-0 nova_compute[186662]: 2026-02-19 19:35:10.295 186666 WARNING neutronclient.v2_0.client [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:35:10 compute-0 nova_compute[186662]: 2026-02-19 19:35:10.821 186666 INFO nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 19 19:35:10 compute-0 nova_compute[186662]: 2026-02-19 19:35:10.834 186666 DEBUG nova.network.neutron [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Successfully created port: 8b59ae32-fc0b-48f7-8566-afb3e4db1434 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Feb 19 19:35:11 compute-0 nova_compute[186662]: 2026-02-19 19:35:11.329 186666 DEBUG nova.compute.manager [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Feb 19 19:35:11 compute-0 nova_compute[186662]: 2026-02-19 19:35:11.936 186666 DEBUG nova.network.neutron [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Successfully updated port: 8b59ae32-fc0b-48f7-8566-afb3e4db1434 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Feb 19 19:35:11 compute-0 nova_compute[186662]: 2026-02-19 19:35:11.988 186666 DEBUG nova.compute.manager [req-28595787-9ad3-450a-9601-9c406da9d7be req-4e70d1cf-c81c-49be-a6b1-7c63f737efe8 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Received event network-changed-8b59ae32-fc0b-48f7-8566-afb3e4db1434 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:35:11 compute-0 nova_compute[186662]: 2026-02-19 19:35:11.988 186666 DEBUG nova.compute.manager [req-28595787-9ad3-450a-9601-9c406da9d7be req-4e70d1cf-c81c-49be-a6b1-7c63f737efe8 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Refreshing instance network info cache due to event network-changed-8b59ae32-fc0b-48f7-8566-afb3e4db1434. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Feb 19 19:35:11 compute-0 nova_compute[186662]: 2026-02-19 19:35:11.988 186666 DEBUG oslo_concurrency.lockutils [req-28595787-9ad3-450a-9601-9c406da9d7be req-4e70d1cf-c81c-49be-a6b1-7c63f737efe8 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:35:11 compute-0 nova_compute[186662]: 2026-02-19 19:35:11.988 186666 DEBUG oslo_concurrency.lockutils [req-28595787-9ad3-450a-9601-9c406da9d7be req-4e70d1cf-c81c-49be-a6b1-7c63f737efe8 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:35:11 compute-0 nova_compute[186662]: 2026-02-19 19:35:11.988 186666 DEBUG nova.network.neutron [req-28595787-9ad3-450a-9601-9c406da9d7be req-4e70d1cf-c81c-49be-a6b1-7c63f737efe8 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Refreshing network info cache for port 8b59ae32-fc0b-48f7-8566-afb3e4db1434 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.347 186666 DEBUG nova.compute.manager [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.348 186666 DEBUG nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.348 186666 INFO nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Creating image(s)
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.349 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "/var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.349 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "/var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.349 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "/var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.350 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.353 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.355 186666 DEBUG oslo_concurrency.processutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.410 186666 DEBUG oslo_concurrency.processutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.411 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.411 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.411 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.414 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.414 186666 DEBUG oslo_concurrency.processutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.441 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "refresh_cache-ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.463 186666 DEBUG oslo_concurrency.processutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.464 186666 DEBUG oslo_concurrency.processutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.485 186666 DEBUG oslo_concurrency.processutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.486 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.075s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.486 186666 DEBUG oslo_concurrency.processutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.493 186666 WARNING neutronclient.v2_0.client [req-28595787-9ad3-450a-9601-9c406da9d7be req-4e70d1cf-c81c-49be-a6b1-7c63f737efe8 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.524 186666 DEBUG oslo_concurrency.processutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.525 186666 DEBUG nova.virt.disk.api [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Checking if we can resize image /var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.525 186666 DEBUG oslo_concurrency.processutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.577 186666 DEBUG oslo_concurrency.processutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.578 186666 DEBUG nova.virt.disk.api [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Cannot resize image /var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.578 186666 DEBUG nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.578 186666 DEBUG nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Ensure instance console log exists: /var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.579 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.579 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.579 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.727 186666 DEBUG nova.network.neutron [req-28595787-9ad3-450a-9601-9c406da9d7be req-4e70d1cf-c81c-49be-a6b1-7c63f737efe8 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.777 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:12 compute-0 nova_compute[186662]: 2026-02-19 19:35:12.901 186666 DEBUG nova.network.neutron [req-28595787-9ad3-450a-9601-9c406da9d7be req-4e70d1cf-c81c-49be-a6b1-7c63f737efe8 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:35:13 compute-0 nova_compute[186662]: 2026-02-19 19:35:13.407 186666 DEBUG oslo_concurrency.lockutils [req-28595787-9ad3-450a-9601-9c406da9d7be req-4e70d1cf-c81c-49be-a6b1-7c63f737efe8 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:35:13 compute-0 nova_compute[186662]: 2026-02-19 19:35:13.409 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquired lock "refresh_cache-ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:35:13 compute-0 nova_compute[186662]: 2026-02-19 19:35:13.409 186666 DEBUG nova.network.neutron [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:35:13 compute-0 nova_compute[186662]: 2026-02-19 19:35:13.493 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:14 compute-0 podman[212449]: 2026-02-19 19:35:14.300886964 +0000 UTC m=+0.065734824 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 19 19:35:14 compute-0 nova_compute[186662]: 2026-02-19 19:35:14.722 186666 DEBUG nova.network.neutron [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.121 186666 WARNING neutronclient.v2_0.client [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.340 186666 DEBUG nova.network.neutron [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Updating instance_info_cache with network_info: [{"id": "8b59ae32-fc0b-48f7-8566-afb3e4db1434", "address": "fa:16:3e:0f:97:74", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b59ae32-fc", "ovs_interfaceid": "8b59ae32-fc0b-48f7-8566-afb3e4db1434", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.576 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.846 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Releasing lock "refresh_cache-ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.846 186666 DEBUG nova.compute.manager [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Instance network_info: |[{"id": "8b59ae32-fc0b-48f7-8566-afb3e4db1434", "address": "fa:16:3e:0f:97:74", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b59ae32-fc", "ovs_interfaceid": "8b59ae32-fc0b-48f7-8566-afb3e4db1434", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.848 186666 DEBUG nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Start _get_guest_xml network_info=[{"id": "8b59ae32-fc0b-48f7-8566-afb3e4db1434", "address": "fa:16:3e:0f:97:74", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b59ae32-fc", "ovs_interfaceid": "8b59ae32-fc0b-48f7-8566-afb3e4db1434", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'image_id': 'b8007ea6-afa7-4c5a-abc0-d9d7338ce087'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.852 186666 WARNING nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.853 186666 DEBUG nova.virt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-317672826', uuid='ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc'), owner=OwnerMeta(userid='f74ba8e1becb4d8f83bb148785aac310', username='tempest-TestExecuteHostMaintenanceStrategy-488758453-project-admin', projectid='87043e904d374d2fbc50a010c14c8987', projectname='tempest-TestExecuteHostMaintenanceStrategy-488758453'), image=ImageMeta(id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "8b59ae32-fc0b-48f7-8566-afb3e4db1434", "address": "fa:16:3e:0f:97:74", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b59ae32-fc", "ovs_interfaceid": 
"8b59ae32-fc0b-48f7-8566-afb3e4db1434", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1771529715.8538003) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.857 186666 DEBUG nova.virt.libvirt.host [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.857 186666 DEBUG nova.virt.libvirt.host [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.860 186666 DEBUG nova.virt.libvirt.host [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.860 186666 DEBUG nova.virt.libvirt.host [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.861 186666 DEBUG nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.861 186666 DEBUG nova.virt.hardware [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-19T19:19:33Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.862 186666 DEBUG nova.virt.hardware [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.862 186666 DEBUG nova.virt.hardware [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.862 186666 DEBUG nova.virt.hardware [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.862 186666 DEBUG nova.virt.hardware [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.862 186666 DEBUG nova.virt.hardware [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.863 186666 DEBUG nova.virt.hardware [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.863 186666 DEBUG nova.virt.hardware [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.863 186666 DEBUG nova.virt.hardware [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.863 186666 DEBUG nova.virt.hardware [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.863 186666 DEBUG nova.virt.hardware [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.867 186666 DEBUG nova.virt.libvirt.vif [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:35:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-317672826',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-317672826',id=15,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='87043e904d374d2fbc50a010c14c8987',ramdisk_id='',reservation_id='r-fyicljro',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-488758453',owner_user_name='tempest-Test
ExecuteHostMaintenanceStrategy-488758453-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:35:11Z,user_data=None,user_id='f74ba8e1becb4d8f83bb148785aac310',uuid=ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b59ae32-fc0b-48f7-8566-afb3e4db1434", "address": "fa:16:3e:0f:97:74", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b59ae32-fc", "ovs_interfaceid": "8b59ae32-fc0b-48f7-8566-afb3e4db1434", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.867 186666 DEBUG nova.network.os_vif_util [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Converting VIF {"id": "8b59ae32-fc0b-48f7-8566-afb3e4db1434", "address": "fa:16:3e:0f:97:74", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b59ae32-fc", "ovs_interfaceid": "8b59ae32-fc0b-48f7-8566-afb3e4db1434", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.869 186666 DEBUG nova.network.os_vif_util [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:97:74,bridge_name='br-int',has_traffic_filtering=True,id=8b59ae32-fc0b-48f7-8566-afb3e4db1434,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b59ae32-fc') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:35:15 compute-0 nova_compute[186662]: 2026-02-19 19:35:15.869 186666 DEBUG nova.objects.instance [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lazy-loading 'pci_devices' on Instance uuid ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:35:16 compute-0 podman[212471]: 2026-02-19 19:35:16.319353227 +0000 UTC m=+0.091163596 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.377 186666 DEBUG nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] End _get_guest_xml xml=<domain type="kvm">
Feb 19 19:35:16 compute-0 nova_compute[186662]:   <uuid>ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc</uuid>
Feb 19 19:35:16 compute-0 nova_compute[186662]:   <name>instance-0000000f</name>
Feb 19 19:35:16 compute-0 nova_compute[186662]:   <memory>131072</memory>
Feb 19 19:35:16 compute-0 nova_compute[186662]:   <vcpu>1</vcpu>
Feb 19 19:35:16 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-317672826</nova:name>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:35:15</nova:creationTime>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:35:16 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:35:16 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:35:16 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:35:16 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:35:16 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:35:16 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:35:16 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:35:16 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:35:16 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:35:16 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:35:16 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:35:16 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:35:16 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:35:16 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:35:16 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:35:16 compute-0 nova_compute[186662]:         <nova:user uuid="f74ba8e1becb4d8f83bb148785aac310">tempest-TestExecuteHostMaintenanceStrategy-488758453-project-admin</nova:user>
Feb 19 19:35:16 compute-0 nova_compute[186662]:         <nova:project uuid="87043e904d374d2fbc50a010c14c8987">tempest-TestExecuteHostMaintenanceStrategy-488758453</nova:project>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:35:16 compute-0 nova_compute[186662]:         <nova:port uuid="8b59ae32-fc0b-48f7-8566-afb3e4db1434">
Feb 19 19:35:16 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:35:16 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:35:16 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <system>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <entry name="serial">ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc</entry>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <entry name="uuid">ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc</entry>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     </system>
Feb 19 19:35:16 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:35:16 compute-0 nova_compute[186662]:   <os>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:   </os>
Feb 19 19:35:16 compute-0 nova_compute[186662]:   <features>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <vmcoreinfo/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:   </features>
Feb 19 19:35:16 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:35:16 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact">
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <model>Nehalem</model>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <topology sockets="1" cores="1" threads="1"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:35:16 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc/disk"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc/disk.config"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <interface type="ethernet">
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <mac address="fa:16:3e:0f:97:74"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <mtu size="1442"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <target dev="tap8b59ae32-fc"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <serial type="pty">
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc/console.log" append="off"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <video>
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     </video>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <controller type="usb" index="0"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:35:16 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:35:16 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:35:16 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:35:16 compute-0 nova_compute[186662]: </domain>
Feb 19 19:35:16 compute-0 nova_compute[186662]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.377 186666 DEBUG nova.compute.manager [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Preparing to wait for external event network-vif-plugged-8b59ae32-fc0b-48f7-8566-afb3e4db1434 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.378 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.378 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.378 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.379 186666 DEBUG nova.virt.libvirt.vif [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:35:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-317672826',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-317672826',id=15,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='87043e904d374d2fbc50a010c14c8987',ramdisk_id='',reservation_id='r-fyicljro',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-488758453',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-488758453-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:35:11Z,user_data=None,user_id='f74ba8e1becb4d8f83bb148785aac310',uuid=ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b59ae32-fc0b-48f7-8566-afb3e4db1434", "address": "fa:16:3e:0f:97:74", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b59ae32-fc", "ovs_interfaceid": "8b59ae32-fc0b-48f7-8566-afb3e4db1434", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.380 186666 DEBUG nova.network.os_vif_util [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Converting VIF {"id": "8b59ae32-fc0b-48f7-8566-afb3e4db1434", "address": "fa:16:3e:0f:97:74", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b59ae32-fc", "ovs_interfaceid": "8b59ae32-fc0b-48f7-8566-afb3e4db1434", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.381 186666 DEBUG nova.network.os_vif_util [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:97:74,bridge_name='br-int',has_traffic_filtering=True,id=8b59ae32-fc0b-48f7-8566-afb3e4db1434,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b59ae32-fc') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.381 186666 DEBUG os_vif [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:97:74,bridge_name='br-int',has_traffic_filtering=True,id=8b59ae32-fc0b-48f7-8566-afb3e4db1434,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b59ae32-fc') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.382 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.382 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.383 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.383 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.384 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '01acbfc4-9291-51ea-929f-329cc3df9edb', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.385 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.387 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.390 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.390 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8b59ae32-fc, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.390 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap8b59ae32-fc, col_values=(('qos', UUID('03af9d84-c79b-47af-ad7c-5ea01c988b23')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.391 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap8b59ae32-fc, col_values=(('external_ids', {'iface-id': '8b59ae32-fc0b-48f7-8566-afb3e4db1434', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0f:97:74', 'vm-uuid': 'ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.392 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:16 compute-0 NetworkManager[56519]: <info>  [1771529716.3931] manager: (tap8b59ae32-fc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.395 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.401 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:16 compute-0 nova_compute[186662]: 2026-02-19 19:35:16.403 186666 INFO os_vif [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:97:74,bridge_name='br-int',has_traffic_filtering=True,id=8b59ae32-fc0b-48f7-8566-afb3e4db1434,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b59ae32-fc')
Feb 19 19:35:17 compute-0 nova_compute[186662]: 2026-02-19 19:35:17.081 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:35:17 compute-0 nova_compute[186662]: 2026-02-19 19:35:17.081 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Feb 19 19:35:17 compute-0 nova_compute[186662]: 2026-02-19 19:35:17.952 186666 DEBUG nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:35:17 compute-0 nova_compute[186662]: 2026-02-19 19:35:17.952 186666 DEBUG nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:35:17 compute-0 nova_compute[186662]: 2026-02-19 19:35:17.953 186666 DEBUG nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] No VIF found with MAC fa:16:3e:0f:97:74, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Feb 19 19:35:17 compute-0 nova_compute[186662]: 2026-02-19 19:35:17.954 186666 INFO nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Using config drive
Feb 19 19:35:18 compute-0 nova_compute[186662]: 2026-02-19 19:35:18.082 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:35:18 compute-0 nova_compute[186662]: 2026-02-19 19:35:18.466 186666 WARNING neutronclient.v2_0.client [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:35:18 compute-0 nova_compute[186662]: 2026-02-19 19:35:18.531 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:18 compute-0 nova_compute[186662]: 2026-02-19 19:35:18.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:35:18 compute-0 nova_compute[186662]: 2026-02-19 19:35:18.781 186666 INFO nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Creating config drive at /var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc/disk.config
Feb 19 19:35:18 compute-0 nova_compute[186662]: 2026-02-19 19:35:18.784 186666 DEBUG oslo_concurrency.processutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp6w1g0ww1 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:35:18 compute-0 nova_compute[186662]: 2026-02-19 19:35:18.907 186666 DEBUG oslo_concurrency.processutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp6w1g0ww1" returned: 0 in 0.123s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:35:18 compute-0 kernel: tap8b59ae32-fc: entered promiscuous mode
Feb 19 19:35:18 compute-0 NetworkManager[56519]: <info>  [1771529718.9828] manager: (tap8b59ae32-fc): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Feb 19 19:35:18 compute-0 ovn_controller[96653]: 2026-02-19T19:35:18Z|00122|binding|INFO|Claiming lport 8b59ae32-fc0b-48f7-8566-afb3e4db1434 for this chassis.
Feb 19 19:35:18 compute-0 ovn_controller[96653]: 2026-02-19T19:35:18Z|00123|binding|INFO|8b59ae32-fc0b-48f7-8566-afb3e4db1434: Claiming fa:16:3e:0f:97:74 10.100.0.7
Feb 19 19:35:18 compute-0 nova_compute[186662]: 2026-02-19 19:35:18.987 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:18.993 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:97:74 10.100.0.7'], port_security=['fa:16:3e:0f:97:74 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '87043e904d374d2fbc50a010c14c8987', 'neutron:revision_number': '4', 'neutron:security_group_ids': '624104e9-162a-4121-9113-49a4c95099e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82a17ccb-74c9-404e-969f-7001a470c860, chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=8b59ae32-fc0b-48f7-8566-afb3e4db1434) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:35:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:18.995 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 8b59ae32-fc0b-48f7-8566-afb3e4db1434 in datapath f37b00da-2392-46ae-ac87-2c54ab8961a2 bound to our chassis
Feb 19 19:35:18 compute-0 ovn_controller[96653]: 2026-02-19T19:35:18Z|00124|binding|INFO|Setting lport 8b59ae32-fc0b-48f7-8566-afb3e4db1434 ovn-installed in OVS
Feb 19 19:35:18 compute-0 ovn_controller[96653]: 2026-02-19T19:35:18Z|00125|binding|INFO|Setting lport 8b59ae32-fc0b-48f7-8566-afb3e4db1434 up in Southbound
Feb 19 19:35:18 compute-0 nova_compute[186662]: 2026-02-19 19:35:18.995 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:18.998 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f37b00da-2392-46ae-ac87-2c54ab8961a2
Feb 19 19:35:19 compute-0 nova_compute[186662]: 2026-02-19 19:35:19.004 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.008 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[7e186069-6c22-4d6c-bcfc-91ca73d88e6a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.010 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf37b00da-21 in ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.012 207540 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf37b00da-20 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.013 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ba4be687-79b5-4821-b39e-8345fd855f81]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.014 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[4d07df03-90be-43dd-b3a3-860fec5fe408]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:19 compute-0 systemd-machined[156014]: New machine qemu-11-instance-0000000f.
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.024 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[77e2b238-3dbf-4ce2-a6ff-4b0a72ca7f72]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.031 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[833bc08f-541a-4763-86f8-0406d94dfcee]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:19 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000f.
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.056 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[343c5490-51b7-49dc-b070-e0c8909dd502]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.061 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[00588fce-47f8-4b94-95f5-cb243048e534]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:19 compute-0 systemd-udevd[212523]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:35:19 compute-0 systemd-udevd[212524]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:35:19 compute-0 NetworkManager[56519]: <info>  [1771529719.0637] manager: (tapf37b00da-20): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Feb 19 19:35:19 compute-0 NetworkManager[56519]: <info>  [1771529719.0791] device (tap8b59ae32-fc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:35:19 compute-0 NetworkManager[56519]: <info>  [1771529719.0806] device (tap8b59ae32-fc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.100 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[4073f7df-65c7-494b-ba85-b6eac34981de]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.105 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[60f5be97-8015-4697-86f6-3458b85139f3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:19 compute-0 NetworkManager[56519]: <info>  [1771529719.1322] device (tapf37b00da-20): carrier: link connected
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.139 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[443bd642-62f8-4392-8462-133d2c702702]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.155 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ccae56f3-a884-4550-b779-7438bda98ce0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf37b00da-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:25:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398869, 'reachable_time': 44521, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212551, 'error': None, 'target': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.166 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ad5859cf-2e6d-4a26-b05f-aed6bad27463]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedc:2582'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398869, 'tstamp': 398869}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212552, 'error': None, 'target': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.176 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[1af55868-f0f7-46d2-9c39-9c8f5c72e560]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf37b00da-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:25:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398869, 'reachable_time': 44521, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212553, 'error': None, 'target': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.196 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[c73c5348-57d1-4347-af42-12b5da6d6034]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.234 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[6ffd36e6-1a70-4fed-8e0e-f5552234a2bb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.235 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf37b00da-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.236 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.236 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf37b00da-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:35:19 compute-0 kernel: tapf37b00da-20: entered promiscuous mode
Feb 19 19:35:19 compute-0 NetworkManager[56519]: <info>  [1771529719.2395] manager: (tapf37b00da-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Feb 19 19:35:19 compute-0 nova_compute[186662]: 2026-02-19 19:35:19.240 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.244 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf37b00da-20, col_values=(('external_ids', {'iface-id': 'e682361b-89b8-4c67-9593-6b6d57e3096a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:35:19 compute-0 nova_compute[186662]: 2026-02-19 19:35:19.245 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:19 compute-0 ovn_controller[96653]: 2026-02-19T19:35:19Z|00126|binding|INFO|Releasing lport e682361b-89b8-4c67-9593-6b6d57e3096a from this chassis (sb_readonly=0)
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.247 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[91cf529b-6a32-4e82-9d58-37be1316a5ab]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.248 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:35:19 compute-0 nova_compute[186662]: 2026-02-19 19:35:19.249 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.249 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.249 105986 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for f37b00da-2392-46ae-ac87-2c54ab8961a2 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.249 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.250 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[0a0593ee-c400-4659-b2e7-f8d581d2b897]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.251 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.251 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[3b6a7b43-b925-41f2-a188-cb2052c5813c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.252 105986 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: global
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     log         /dev/log local0 debug
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     log-tag     haproxy-metadata-proxy-f37b00da-2392-46ae-ac87-2c54ab8961a2
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     user        root
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     group       root
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     maxconn     1024
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     pidfile     /var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     daemon
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: defaults
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     log global
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     mode http
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     option httplog
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     option dontlognull
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     option http-server-close
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     option forwardfor
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     retries                 3
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     timeout http-request    30s
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     timeout connect         30s
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     timeout client          32s
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     timeout server          32s
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     timeout http-keep-alive 30s
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: listen listener
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     bind 169.254.169.254:80
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     server metadata /var/lib/neutron/metadata_proxy
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:     http-request add-header X-OVN-Network-ID f37b00da-2392-46ae-ac87-2c54ab8961a2
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Feb 19 19:35:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:19.252 105986 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'env', 'PROCESS_TAG=haproxy-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f37b00da-2392-46ae-ac87-2c54ab8961a2.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Feb 19 19:35:19 compute-0 nova_compute[186662]: 2026-02-19 19:35:19.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:35:19 compute-0 nova_compute[186662]: 2026-02-19 19:35:19.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:35:19 compute-0 nova_compute[186662]: 2026-02-19 19:35:19.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:35:19 compute-0 nova_compute[186662]: 2026-02-19 19:35:19.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Feb 19 19:35:19 compute-0 podman[212592]: 2026-02-19 19:35:19.591457756 +0000 UTC m=+0.046924779 container create 582ab52d9acfeb423aadc86b7f222e2e43d1bfeaeae439156cef537c888b27fc (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:35:19 compute-0 systemd[1]: Started libpod-conmon-582ab52d9acfeb423aadc86b7f222e2e43d1bfeaeae439156cef537c888b27fc.scope.
Feb 19 19:35:19 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:35:19 compute-0 podman[212592]: 2026-02-19 19:35:19.561320598 +0000 UTC m=+0.016787541 image pull dfc4ee2c621ea595c16a7cd7a8233293e80df6b99346b1ce1a7ed22a35aace27 38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Feb 19 19:35:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aac5edc62b3b86fdf0ab164426329eb68194ee646dea4d4c3640f118f3e48414/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 19 19:35:19 compute-0 podman[212592]: 2026-02-19 19:35:19.669103098 +0000 UTC m=+0.124570081 container init 582ab52d9acfeb423aadc86b7f222e2e43d1bfeaeae439156cef537c888b27fc (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:35:19 compute-0 podman[212592]: 2026-02-19 19:35:19.67287396 +0000 UTC m=+0.128340933 container start 582ab52d9acfeb423aadc86b7f222e2e43d1bfeaeae439156cef537c888b27fc (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Feb 19 19:35:19 compute-0 neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2[212607]: [NOTICE]   (212612) : New worker (212614) forked
Feb 19 19:35:19 compute-0 neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2[212607]: [NOTICE]   (212612) : Loading success.
Feb 19 19:35:19 compute-0 nova_compute[186662]: 2026-02-19 19:35:19.805 186666 DEBUG nova.compute.manager [req-009fa76f-ace4-44ee-b022-4f539f4db57d req-869afe25-e2f6-4c11-ae2f-938aa424db9b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Received event network-vif-plugged-8b59ae32-fc0b-48f7-8566-afb3e4db1434 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:35:19 compute-0 nova_compute[186662]: 2026-02-19 19:35:19.805 186666 DEBUG oslo_concurrency.lockutils [req-009fa76f-ace4-44ee-b022-4f539f4db57d req-869afe25-e2f6-4c11-ae2f-938aa424db9b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:35:19 compute-0 nova_compute[186662]: 2026-02-19 19:35:19.806 186666 DEBUG oslo_concurrency.lockutils [req-009fa76f-ace4-44ee-b022-4f539f4db57d req-869afe25-e2f6-4c11-ae2f-938aa424db9b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:35:19 compute-0 nova_compute[186662]: 2026-02-19 19:35:19.806 186666 DEBUG oslo_concurrency.lockutils [req-009fa76f-ace4-44ee-b022-4f539f4db57d req-869afe25-e2f6-4c11-ae2f-938aa424db9b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:35:19 compute-0 nova_compute[186662]: 2026-02-19 19:35:19.806 186666 DEBUG nova.compute.manager [req-009fa76f-ace4-44ee-b022-4f539f4db57d req-869afe25-e2f6-4c11-ae2f-938aa424db9b 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Processing event network-vif-plugged-8b59ae32-fc0b-48f7-8566-afb3e4db1434 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Feb 19 19:35:19 compute-0 nova_compute[186662]: 2026-02-19 19:35:19.807 186666 DEBUG nova.compute.manager [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Feb 19 19:35:19 compute-0 nova_compute[186662]: 2026-02-19 19:35:19.813 186666 DEBUG nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Feb 19 19:35:19 compute-0 nova_compute[186662]: 2026-02-19 19:35:19.817 186666 INFO nova.virt.libvirt.driver [-] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Instance spawned successfully.
Feb 19 19:35:19 compute-0 nova_compute[186662]: 2026-02-19 19:35:19.817 186666 DEBUG nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Feb 19 19:35:20 compute-0 nova_compute[186662]: 2026-02-19 19:35:20.080 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Feb 19 19:35:20 compute-0 nova_compute[186662]: 2026-02-19 19:35:20.331 186666 DEBUG nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:35:20 compute-0 nova_compute[186662]: 2026-02-19 19:35:20.332 186666 DEBUG nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:35:20 compute-0 nova_compute[186662]: 2026-02-19 19:35:20.333 186666 DEBUG nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:35:20 compute-0 nova_compute[186662]: 2026-02-19 19:35:20.334 186666 DEBUG nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:35:20 compute-0 nova_compute[186662]: 2026-02-19 19:35:20.335 186666 DEBUG nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:35:20 compute-0 nova_compute[186662]: 2026-02-19 19:35:20.335 186666 DEBUG nova.virt.libvirt.driver [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:35:20 compute-0 nova_compute[186662]: 2026-02-19 19:35:20.847 186666 INFO nova.compute.manager [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Took 8.50 seconds to spawn the instance on the hypervisor.
Feb 19 19:35:20 compute-0 nova_compute[186662]: 2026-02-19 19:35:20.847 186666 DEBUG nova.compute.manager [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:35:21 compute-0 podman[212623]: 2026-02-19 19:35:21.277834905 +0000 UTC m=+0.050834094 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 19 19:35:21 compute-0 nova_compute[186662]: 2026-02-19 19:35:21.378 186666 INFO nova.compute.manager [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Took 13.70 seconds to build instance.
Feb 19 19:35:21 compute-0 nova_compute[186662]: 2026-02-19 19:35:21.392 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:21 compute-0 nova_compute[186662]: 2026-02-19 19:35:21.850 186666 DEBUG nova.compute.manager [req-56607e29-0cf7-4c86-bee5-7efd66951cbe req-09279fe2-d713-48fa-af73-b643036365c9 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Received event network-vif-plugged-8b59ae32-fc0b-48f7-8566-afb3e4db1434 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:35:21 compute-0 nova_compute[186662]: 2026-02-19 19:35:21.850 186666 DEBUG oslo_concurrency.lockutils [req-56607e29-0cf7-4c86-bee5-7efd66951cbe req-09279fe2-d713-48fa-af73-b643036365c9 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:35:21 compute-0 nova_compute[186662]: 2026-02-19 19:35:21.851 186666 DEBUG oslo_concurrency.lockutils [req-56607e29-0cf7-4c86-bee5-7efd66951cbe req-09279fe2-d713-48fa-af73-b643036365c9 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:35:21 compute-0 nova_compute[186662]: 2026-02-19 19:35:21.851 186666 DEBUG oslo_concurrency.lockutils [req-56607e29-0cf7-4c86-bee5-7efd66951cbe req-09279fe2-d713-48fa-af73-b643036365c9 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:35:21 compute-0 nova_compute[186662]: 2026-02-19 19:35:21.851 186666 DEBUG nova.compute.manager [req-56607e29-0cf7-4c86-bee5-7efd66951cbe req-09279fe2-d713-48fa-af73-b643036365c9 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] No waiting events found dispatching network-vif-plugged-8b59ae32-fc0b-48f7-8566-afb3e4db1434 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:35:21 compute-0 nova_compute[186662]: 2026-02-19 19:35:21.852 186666 WARNING nova.compute.manager [req-56607e29-0cf7-4c86-bee5-7efd66951cbe req-09279fe2-d713-48fa-af73-b643036365c9 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Received unexpected event network-vif-plugged-8b59ae32-fc0b-48f7-8566-afb3e4db1434 for instance with vm_state active and task_state None.
Feb 19 19:35:21 compute-0 nova_compute[186662]: 2026-02-19 19:35:21.884 186666 DEBUG oslo_concurrency.lockutils [None req-06c492eb-302f-4f3d-9ba9-07e1d4e456e9 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.219s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:35:22 compute-0 nova_compute[186662]: 2026-02-19 19:35:22.076 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:35:22 compute-0 nova_compute[186662]: 2026-02-19 19:35:22.584 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:35:23 compute-0 nova_compute[186662]: 2026-02-19 19:35:23.094 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:35:23 compute-0 nova_compute[186662]: 2026-02-19 19:35:23.095 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:35:23 compute-0 nova_compute[186662]: 2026-02-19 19:35:23.095 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:35:23 compute-0 nova_compute[186662]: 2026-02-19 19:35:23.095 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:35:23 compute-0 nova_compute[186662]: 2026-02-19 19:35:23.571 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:24 compute-0 nova_compute[186662]: 2026-02-19 19:35:24.134 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:35:24 compute-0 nova_compute[186662]: 2026-02-19 19:35:24.209 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:35:24 compute-0 nova_compute[186662]: 2026-02-19 19:35:24.210 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:35:24 compute-0 nova_compute[186662]: 2026-02-19 19:35:24.256 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:35:24 compute-0 nova_compute[186662]: 2026-02-19 19:35:24.376 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:35:24 compute-0 nova_compute[186662]: 2026-02-19 19:35:24.378 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:35:24 compute-0 nova_compute[186662]: 2026-02-19 19:35:24.386 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.009s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:35:24 compute-0 nova_compute[186662]: 2026-02-19 19:35:24.387 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5674MB free_disk=72.97577667236328GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:35:24 compute-0 nova_compute[186662]: 2026-02-19 19:35:24.388 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:35:24 compute-0 nova_compute[186662]: 2026-02-19 19:35:24.388 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:35:25 compute-0 nova_compute[186662]: 2026-02-19 19:35:25.430 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:35:25 compute-0 nova_compute[186662]: 2026-02-19 19:35:25.432 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:35:25 compute-0 nova_compute[186662]: 2026-02-19 19:35:25.432 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:35:24 up  1:06,  0 user,  load average: 0.45, 0.36, 0.36\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_87043e904d374d2fbc50a010c14c8987': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:35:25 compute-0 nova_compute[186662]: 2026-02-19 19:35:25.474 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:35:25 compute-0 nova_compute[186662]: 2026-02-19 19:35:25.981 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:35:26 compute-0 nova_compute[186662]: 2026-02-19 19:35:26.394 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:26 compute-0 nova_compute[186662]: 2026-02-19 19:35:26.495 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:35:26 compute-0 nova_compute[186662]: 2026-02-19 19:35:26.496 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.108s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:35:28 compute-0 sshd-session[212657]: Invalid user oracle from 106.51.64.128 port 29090
Feb 19 19:35:28 compute-0 sshd-session[212657]: Received disconnect from 106.51.64.128 port 29090:11: Bye Bye [preauth]
Feb 19 19:35:28 compute-0 sshd-session[212657]: Disconnected from invalid user oracle 106.51.64.128 port 29090 [preauth]
Feb 19 19:35:28 compute-0 nova_compute[186662]: 2026-02-19 19:35:28.613 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:29 compute-0 podman[196025]: time="2026-02-19T19:35:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:35:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:35:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17220 "" "Go-http-client/1.1"
Feb 19 19:35:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:35:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2654 "" "Go-http-client/1.1"
Feb 19 19:35:31 compute-0 nova_compute[186662]: 2026-02-19 19:35:31.397 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:31 compute-0 openstack_network_exporter[198916]: ERROR   19:35:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:35:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:35:31 compute-0 openstack_network_exporter[198916]: ERROR   19:35:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:35:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:35:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:32.134 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:35:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:32.135 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:35:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:32.135 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:35:32 compute-0 ovn_controller[96653]: 2026-02-19T19:35:32Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0f:97:74 10.100.0.7
Feb 19 19:35:32 compute-0 ovn_controller[96653]: 2026-02-19T19:35:32Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0f:97:74 10.100.0.7
Feb 19 19:35:33 compute-0 nova_compute[186662]: 2026-02-19 19:35:33.655 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:34 compute-0 nova_compute[186662]: 2026-02-19 19:35:34.273 186666 DEBUG nova.virt.libvirt.driver [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Creating tmpfile /var/lib/nova/instances/tmpw0wdwbly to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Feb 19 19:35:34 compute-0 nova_compute[186662]: 2026-02-19 19:35:34.274 186666 WARNING neutronclient.v2_0.client [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:35:34 compute-0 nova_compute[186662]: 2026-02-19 19:35:34.277 186666 DEBUG nova.compute.manager [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpw0wdwbly',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Feb 19 19:35:36 compute-0 nova_compute[186662]: 2026-02-19 19:35:36.309 186666 WARNING neutronclient.v2_0.client [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:35:36 compute-0 nova_compute[186662]: 2026-02-19 19:35:36.399 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:38 compute-0 nova_compute[186662]: 2026-02-19 19:35:38.676 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:40 compute-0 nova_compute[186662]: 2026-02-19 19:35:40.178 186666 DEBUG nova.compute.manager [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpw0wdwbly',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='df9e632d-e014-4389-bc9d-471df8d0131c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Feb 19 19:35:41 compute-0 nova_compute[186662]: 2026-02-19 19:35:41.195 186666 DEBUG oslo_concurrency.lockutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-df9e632d-e014-4389-bc9d-471df8d0131c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:35:41 compute-0 nova_compute[186662]: 2026-02-19 19:35:41.196 186666 DEBUG oslo_concurrency.lockutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-df9e632d-e014-4389-bc9d-471df8d0131c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:35:41 compute-0 nova_compute[186662]: 2026-02-19 19:35:41.196 186666 DEBUG nova.network.neutron [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:35:41 compute-0 podman[212675]: 2026-02-19 19:35:41.306745934 +0000 UTC m=+0.069831797 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 19 19:35:41 compute-0 nova_compute[186662]: 2026-02-19 19:35:41.400 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:41 compute-0 nova_compute[186662]: 2026-02-19 19:35:41.703 186666 WARNING neutronclient.v2_0.client [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:35:43 compute-0 nova_compute[186662]: 2026-02-19 19:35:43.532 186666 WARNING neutronclient.v2_0.client [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:35:43 compute-0 nova_compute[186662]: 2026-02-19 19:35:43.658 186666 DEBUG nova.network.neutron [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Updating instance_info_cache with network_info: [{"id": "015734ec-d897-4090-8bc9-7d5e71101ac0", "address": "fa:16:3e:d9:bd:9f", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap015734ec-d8", "ovs_interfaceid": "015734ec-d897-4090-8bc9-7d5e71101ac0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:35:43 compute-0 nova_compute[186662]: 2026-02-19 19:35:43.698 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.163 186666 DEBUG oslo_concurrency.lockutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-df9e632d-e014-4389-bc9d-471df8d0131c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.177 186666 DEBUG nova.virt.libvirt.driver [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpw0wdwbly',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='df9e632d-e014-4389-bc9d-471df8d0131c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.178 186666 DEBUG nova.virt.libvirt.driver [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Creating instance directory: /var/lib/nova/instances/df9e632d-e014-4389-bc9d-471df8d0131c pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.178 186666 DEBUG nova.virt.libvirt.driver [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Creating disk.info with the contents: {'/var/lib/nova/instances/df9e632d-e014-4389-bc9d-471df8d0131c/disk': 'qcow2', '/var/lib/nova/instances/df9e632d-e014-4389-bc9d-471df8d0131c/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.179 186666 DEBUG nova.virt.libvirt.driver [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.179 186666 DEBUG nova.objects.instance [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'trusted_certs' on Instance uuid df9e632d-e014-4389-bc9d-471df8d0131c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.685 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.690 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.692 186666 DEBUG oslo_concurrency.processutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.728 186666 DEBUG oslo_concurrency.processutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.729 186666 DEBUG oslo_concurrency.lockutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.730 186666 DEBUG oslo_concurrency.lockutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.730 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.734 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.735 186666 DEBUG oslo_concurrency.processutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.775 186666 DEBUG oslo_concurrency.processutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.776 186666 DEBUG oslo_concurrency.processutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/df9e632d-e014-4389-bc9d-471df8d0131c/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.797 186666 DEBUG oslo_concurrency.processutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/df9e632d-e014-4389-bc9d-471df8d0131c/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.798 186666 DEBUG oslo_concurrency.lockutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.069s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.799 186666 DEBUG oslo_concurrency.processutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.838 186666 DEBUG oslo_concurrency.processutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.839 186666 DEBUG nova.virt.disk.api [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Checking if we can resize image /var/lib/nova/instances/df9e632d-e014-4389-bc9d-471df8d0131c/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.839 186666 DEBUG oslo_concurrency.processutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df9e632d-e014-4389-bc9d-471df8d0131c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.880 186666 DEBUG oslo_concurrency.processutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df9e632d-e014-4389-bc9d-471df8d0131c/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.881 186666 DEBUG nova.virt.disk.api [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Cannot resize image /var/lib/nova/instances/df9e632d-e014-4389-bc9d-471df8d0131c/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:35:44 compute-0 nova_compute[186662]: 2026-02-19 19:35:44.881 186666 DEBUG nova.objects.instance [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'migration_context' on Instance uuid df9e632d-e014-4389-bc9d-471df8d0131c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:35:45 compute-0 podman[212710]: 2026-02-19 19:35:45.286046759 +0000 UTC m=+0.061846534 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.7, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, architecture=x86_64, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7)
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.387 186666 DEBUG nova.objects.base [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Object Instance<df9e632d-e014-4389-bc9d-471df8d0131c> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.388 186666 DEBUG oslo_concurrency.processutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/df9e632d-e014-4389-bc9d-471df8d0131c/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.411 186666 DEBUG oslo_concurrency.processutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/df9e632d-e014-4389-bc9d-471df8d0131c/disk.config 497664" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.412 186666 DEBUG nova.virt.libvirt.driver [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.413 186666 DEBUG nova.virt.libvirt.vif [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-02-19T19:34:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-2073515523',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-2073515523',id=14,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:35:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='87043e904d374d2fbc50a010c14c8987',ramdisk_id='',reservation_id='r-yx5r32yr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-488758453',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-488758453-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:35:02Z,user_data=None,user_id='f74ba8e1becb4d8f83bb148785aac310',uuid=df9e632d-e014-4389-bc9d-471df8d0131c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "015734ec-d897-4090-8bc9-7d5e71101ac0", "address": "fa:16:3e:d9:bd:9f", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap015734ec-d8", "ovs_interfaceid": "015734ec-d897-4090-8bc9-7d5e71101ac0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.414 186666 DEBUG nova.network.os_vif_util [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "015734ec-d897-4090-8bc9-7d5e71101ac0", "address": "fa:16:3e:d9:bd:9f", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap015734ec-d8", "ovs_interfaceid": "015734ec-d897-4090-8bc9-7d5e71101ac0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.414 186666 DEBUG nova.network.os_vif_util [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:bd:9f,bridge_name='br-int',has_traffic_filtering=True,id=015734ec-d897-4090-8bc9-7d5e71101ac0,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap015734ec-d8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.415 186666 DEBUG os_vif [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:bd:9f,bridge_name='br-int',has_traffic_filtering=True,id=015734ec-d897-4090-8bc9-7d5e71101ac0,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap015734ec-d8') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.416 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.416 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.416 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.417 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.418 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'acb56c3a-72e6-509a-90de-d69f571c990c', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.419 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.421 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.424 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.425 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap015734ec-d8, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.425 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap015734ec-d8, col_values=(('qos', UUID('37b5eedd-0574-4c47-a837-54a621fd49cb')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.426 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap015734ec-d8, col_values=(('external_ids', {'iface-id': '015734ec-d897-4090-8bc9-7d5e71101ac0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d9:bd:9f', 'vm-uuid': 'df9e632d-e014-4389-bc9d-471df8d0131c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.427 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:45 compute-0 NetworkManager[56519]: <info>  [1771529745.4281] manager: (tap015734ec-d8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.430 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.435 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.436 186666 INFO os_vif [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:bd:9f,bridge_name='br-int',has_traffic_filtering=True,id=015734ec-d897-4090-8bc9-7d5e71101ac0,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap015734ec-d8')
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.437 186666 DEBUG nova.virt.libvirt.driver [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.437 186666 DEBUG nova.compute.manager [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpw0wdwbly',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='df9e632d-e014-4389-bc9d-471df8d0131c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.438 186666 WARNING neutronclient.v2_0.client [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:35:45 compute-0 nova_compute[186662]: 2026-02-19 19:35:45.669 186666 WARNING neutronclient.v2_0.client [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:35:46 compute-0 nova_compute[186662]: 2026-02-19 19:35:46.311 186666 DEBUG nova.network.neutron [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Port 015734ec-d897-4090-8bc9-7d5e71101ac0 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Feb 19 19:35:46 compute-0 nova_compute[186662]: 2026-02-19 19:35:46.326 186666 DEBUG nova.compute.manager [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpw0wdwbly',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='df9e632d-e014-4389-bc9d-471df8d0131c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Feb 19 19:35:47 compute-0 podman[212736]: 2026-02-19 19:35:47.382643648 +0000 UTC m=+0.154980742 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 19 19:35:48 compute-0 nova_compute[186662]: 2026-02-19 19:35:48.723 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:49 compute-0 ovn_controller[96653]: 2026-02-19T19:35:49Z|00127|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 19 19:35:49 compute-0 kernel: tap015734ec-d8: entered promiscuous mode
Feb 19 19:35:49 compute-0 NetworkManager[56519]: <info>  [1771529749.7789] manager: (tap015734ec-d8): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Feb 19 19:35:49 compute-0 systemd-udevd[212775]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:35:49 compute-0 ovn_controller[96653]: 2026-02-19T19:35:49Z|00128|binding|INFO|Claiming lport 015734ec-d897-4090-8bc9-7d5e71101ac0 for this additional chassis.
Feb 19 19:35:49 compute-0 ovn_controller[96653]: 2026-02-19T19:35:49Z|00129|binding|INFO|015734ec-d897-4090-8bc9-7d5e71101ac0: Claiming fa:16:3e:d9:bd:9f 10.100.0.10
Feb 19 19:35:49 compute-0 nova_compute[186662]: 2026-02-19 19:35:49.839 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:49 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:49.849 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:bd:9f 10.100.0.10'], port_security=['fa:16:3e:d9:bd:9f 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'df9e632d-e014-4389-bc9d-471df8d0131c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '87043e904d374d2fbc50a010c14c8987', 'neutron:revision_number': '10', 'neutron:security_group_ids': '624104e9-162a-4121-9113-49a4c95099e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82a17ccb-74c9-404e-969f-7001a470c860, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=015734ec-d897-4090-8bc9-7d5e71101ac0) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:35:49 compute-0 nova_compute[186662]: 2026-02-19 19:35:49.850 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:49 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:49.850 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 015734ec-d897-4090-8bc9-7d5e71101ac0 in datapath f37b00da-2392-46ae-ac87-2c54ab8961a2 unbound from our chassis
Feb 19 19:35:49 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:49.853 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f37b00da-2392-46ae-ac87-2c54ab8961a2
Feb 19 19:35:49 compute-0 ovn_controller[96653]: 2026-02-19T19:35:49Z|00130|binding|INFO|Setting lport 015734ec-d897-4090-8bc9-7d5e71101ac0 ovn-installed in OVS
Feb 19 19:35:49 compute-0 nova_compute[186662]: 2026-02-19 19:35:49.855 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:49 compute-0 NetworkManager[56519]: <info>  [1771529749.8600] device (tap015734ec-d8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:35:49 compute-0 NetworkManager[56519]: <info>  [1771529749.8613] device (tap015734ec-d8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:35:49 compute-0 systemd-machined[156014]: New machine qemu-12-instance-0000000e.
Feb 19 19:35:49 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:49.869 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[09a22dc2-036c-4d3e-adff-1f98d4e2d4cf]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:49 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-0000000e.
Feb 19 19:35:49 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:49.893 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[c58cf317-c7b3-4f0e-98d8-8cd279ccb7b8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:49 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:49.896 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[72d2406e-f974-4b6b-88d2-c00661410bd4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:49 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:49.922 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[5912f897-1aa0-4f25-b6dd-2be7f920a6ef]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:49 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:49.937 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[71376eff-9473-4d94-b698-aa7b18ee8a2b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf37b00da-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:25:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398869, 'reachable_time': 44521, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212791, 'error': None, 'target': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:49 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:49.952 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[fc7f8ff1-d005-48de-9d09-ddffa8547e82]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf37b00da-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398877, 'tstamp': 398877}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212793, 'error': None, 'target': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf37b00da-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398879, 'tstamp': 398879}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212793, 'error': None, 'target': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:49 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:49.953 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf37b00da-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:35:49 compute-0 nova_compute[186662]: 2026-02-19 19:35:49.955 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:49 compute-0 nova_compute[186662]: 2026-02-19 19:35:49.956 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:49 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:49.957 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf37b00da-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:35:49 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:49.957 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:35:49 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:49.957 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf37b00da-20, col_values=(('external_ids', {'iface-id': 'e682361b-89b8-4c67-9593-6b6d57e3096a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:35:49 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:49.957 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:35:49 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:35:49.959 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[8181f2d1-ff09-4824-9af8-968f0a385e5b]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f37b00da-2392-46ae-ac87-2c54ab8961a2\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f37b00da-2392-46ae-ac87-2c54ab8961a2\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:35:50 compute-0 nova_compute[186662]: 2026-02-19 19:35:50.427 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:52 compute-0 podman[212816]: 2026-02-19 19:35:52.299581282 +0000 UTC m=+0.063786271 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:35:53 compute-0 ovn_controller[96653]: 2026-02-19T19:35:53Z|00131|binding|INFO|Claiming lport 015734ec-d897-4090-8bc9-7d5e71101ac0 for this chassis.
Feb 19 19:35:53 compute-0 ovn_controller[96653]: 2026-02-19T19:35:53Z|00132|binding|INFO|015734ec-d897-4090-8bc9-7d5e71101ac0: Claiming fa:16:3e:d9:bd:9f 10.100.0.10
Feb 19 19:35:53 compute-0 ovn_controller[96653]: 2026-02-19T19:35:53Z|00133|binding|INFO|Setting lport 015734ec-d897-4090-8bc9-7d5e71101ac0 up in Southbound
Feb 19 19:35:53 compute-0 nova_compute[186662]: 2026-02-19 19:35:53.775 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:54 compute-0 nova_compute[186662]: 2026-02-19 19:35:54.662 186666 INFO nova.compute.manager [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Post operation of migration started
Feb 19 19:35:54 compute-0 nova_compute[186662]: 2026-02-19 19:35:54.662 186666 WARNING neutronclient.v2_0.client [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:35:54 compute-0 nova_compute[186662]: 2026-02-19 19:35:54.757 186666 WARNING neutronclient.v2_0.client [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:35:54 compute-0 nova_compute[186662]: 2026-02-19 19:35:54.758 186666 WARNING neutronclient.v2_0.client [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:35:54 compute-0 nova_compute[186662]: 2026-02-19 19:35:54.847 186666 DEBUG oslo_concurrency.lockutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-df9e632d-e014-4389-bc9d-471df8d0131c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:35:54 compute-0 nova_compute[186662]: 2026-02-19 19:35:54.848 186666 DEBUG oslo_concurrency.lockutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-df9e632d-e014-4389-bc9d-471df8d0131c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:35:54 compute-0 nova_compute[186662]: 2026-02-19 19:35:54.848 186666 DEBUG nova.network.neutron [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:35:55 compute-0 nova_compute[186662]: 2026-02-19 19:35:55.354 186666 WARNING neutronclient.v2_0.client [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:35:55 compute-0 nova_compute[186662]: 2026-02-19 19:35:55.430 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:56 compute-0 nova_compute[186662]: 2026-02-19 19:35:56.705 186666 WARNING neutronclient.v2_0.client [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:35:56 compute-0 nova_compute[186662]: 2026-02-19 19:35:56.859 186666 DEBUG nova.network.neutron [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Updating instance_info_cache with network_info: [{"id": "015734ec-d897-4090-8bc9-7d5e71101ac0", "address": "fa:16:3e:d9:bd:9f", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap015734ec-d8", "ovs_interfaceid": "015734ec-d897-4090-8bc9-7d5e71101ac0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:35:57 compute-0 nova_compute[186662]: 2026-02-19 19:35:57.369 186666 DEBUG oslo_concurrency.lockutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-df9e632d-e014-4389-bc9d-471df8d0131c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:35:57 compute-0 nova_compute[186662]: 2026-02-19 19:35:57.891 186666 DEBUG oslo_concurrency.lockutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:35:57 compute-0 nova_compute[186662]: 2026-02-19 19:35:57.892 186666 DEBUG oslo_concurrency.lockutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:35:57 compute-0 nova_compute[186662]: 2026-02-19 19:35:57.892 186666 DEBUG oslo_concurrency.lockutils [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:35:57 compute-0 nova_compute[186662]: 2026-02-19 19:35:57.897 186666 INFO nova.virt.libvirt.driver [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 19 19:35:57 compute-0 virtqemud[186157]: Domain id=12 name='instance-0000000e' uuid=df9e632d-e014-4389-bc9d-471df8d0131c is tainted: custom-monitor
Feb 19 19:35:58 compute-0 nova_compute[186662]: 2026-02-19 19:35:58.777 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:35:58 compute-0 nova_compute[186662]: 2026-02-19 19:35:58.904 186666 INFO nova.virt.libvirt.driver [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 19 19:35:59 compute-0 podman[196025]: time="2026-02-19T19:35:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:35:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:35:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17220 "" "Go-http-client/1.1"
Feb 19 19:35:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:35:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2650 "" "Go-http-client/1.1"
Feb 19 19:35:59 compute-0 nova_compute[186662]: 2026-02-19 19:35:59.908 186666 INFO nova.virt.libvirt.driver [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 19 19:35:59 compute-0 nova_compute[186662]: 2026-02-19 19:35:59.913 186666 DEBUG nova.compute.manager [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:36:00 compute-0 nova_compute[186662]: 2026-02-19 19:36:00.424 186666 DEBUG nova.objects.instance [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Feb 19 19:36:00 compute-0 nova_compute[186662]: 2026-02-19 19:36:00.431 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:01 compute-0 openstack_network_exporter[198916]: ERROR   19:36:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:36:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:36:01 compute-0 openstack_network_exporter[198916]: ERROR   19:36:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:36:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:36:01 compute-0 nova_compute[186662]: 2026-02-19 19:36:01.443 186666 WARNING neutronclient.v2_0.client [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:36:01 compute-0 nova_compute[186662]: 2026-02-19 19:36:01.903 186666 WARNING neutronclient.v2_0.client [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:36:01 compute-0 nova_compute[186662]: 2026-02-19 19:36:01.903 186666 WARNING neutronclient.v2_0.client [None req-7a144e94-4d7a-46b1-bb3a-a9e60dee4791 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:36:03 compute-0 sshd-session[212840]: Invalid user claude from 189.165.79.177 port 56542
Feb 19 19:36:03 compute-0 sshd-session[212840]: Received disconnect from 189.165.79.177 port 56542:11: Bye Bye [preauth]
Feb 19 19:36:03 compute-0 sshd-session[212840]: Disconnected from invalid user claude 189.165.79.177 port 56542 [preauth]
Feb 19 19:36:03 compute-0 nova_compute[186662]: 2026-02-19 19:36:03.822 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:05 compute-0 nova_compute[186662]: 2026-02-19 19:36:05.433 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:08 compute-0 nova_compute[186662]: 2026-02-19 19:36:08.858 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:10 compute-0 nova_compute[186662]: 2026-02-19 19:36:10.435 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:10 compute-0 nova_compute[186662]: 2026-02-19 19:36:10.866 186666 DEBUG oslo_concurrency.lockutils [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:36:10 compute-0 nova_compute[186662]: 2026-02-19 19:36:10.867 186666 DEBUG oslo_concurrency.lockutils [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:36:10 compute-0 nova_compute[186662]: 2026-02-19 19:36:10.867 186666 DEBUG oslo_concurrency.lockutils [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:36:10 compute-0 nova_compute[186662]: 2026-02-19 19:36:10.867 186666 DEBUG oslo_concurrency.lockutils [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:36:10 compute-0 nova_compute[186662]: 2026-02-19 19:36:10.868 186666 DEBUG oslo_concurrency.lockutils [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:36:10 compute-0 nova_compute[186662]: 2026-02-19 19:36:10.881 186666 INFO nova.compute.manager [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Terminating instance
Feb 19 19:36:11 compute-0 nova_compute[186662]: 2026-02-19 19:36:11.399 186666 DEBUG nova.compute.manager [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:36:11 compute-0 kernel: tap8b59ae32-fc (unregistering): left promiscuous mode
Feb 19 19:36:11 compute-0 NetworkManager[56519]: <info>  [1771529771.4274] device (tap8b59ae32-fc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:36:11 compute-0 ovn_controller[96653]: 2026-02-19T19:36:11Z|00134|binding|INFO|Releasing lport 8b59ae32-fc0b-48f7-8566-afb3e4db1434 from this chassis (sb_readonly=0)
Feb 19 19:36:11 compute-0 ovn_controller[96653]: 2026-02-19T19:36:11Z|00135|binding|INFO|Setting lport 8b59ae32-fc0b-48f7-8566-afb3e4db1434 down in Southbound
Feb 19 19:36:11 compute-0 nova_compute[186662]: 2026-02-19 19:36:11.436 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:11 compute-0 ovn_controller[96653]: 2026-02-19T19:36:11Z|00136|binding|INFO|Removing iface tap8b59ae32-fc ovn-installed in OVS
Feb 19 19:36:11 compute-0 nova_compute[186662]: 2026-02-19 19:36:11.439 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:11.444 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:97:74 10.100.0.7'], port_security=['fa:16:3e:0f:97:74 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '87043e904d374d2fbc50a010c14c8987', 'neutron:revision_number': '5', 'neutron:security_group_ids': '624104e9-162a-4121-9113-49a4c95099e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82a17ccb-74c9-404e-969f-7001a470c860, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=8b59ae32-fc0b-48f7-8566-afb3e4db1434) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:36:11 compute-0 nova_compute[186662]: 2026-02-19 19:36:11.444 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:11.445 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 8b59ae32-fc0b-48f7-8566-afb3e4db1434 in datapath f37b00da-2392-46ae-ac87-2c54ab8961a2 unbound from our chassis
Feb 19 19:36:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:11.446 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f37b00da-2392-46ae-ac87-2c54ab8961a2
Feb 19 19:36:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:11.454 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[18de56ba-8967-42d8-b008-05bd345ecc3e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:36:11 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Feb 19 19:36:11 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Consumed 13.255s CPU time.
Feb 19 19:36:11 compute-0 systemd-machined[156014]: Machine qemu-11-instance-0000000f terminated.
Feb 19 19:36:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:11.473 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[f17101ef-ba7f-488f-b2a2-4b489fd156b3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:36:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:11.476 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[03d90529-8f14-43a1-bfa7-5006f46bdc49]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:36:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:11.490 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[8dc28889-3478-4e8a-a765-75f69ff380a9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:36:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:11.500 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[339b4599-61a4-4861-b607-b4431d455b9d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf37b00da-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:25:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398869, 'reachable_time': 44521, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212865, 'error': None, 'target': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:36:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:11.506 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[3b901e26-e75d-4799-89ae-9c0aee401985]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf37b00da-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398877, 'tstamp': 398877}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212871, 'error': None, 'target': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf37b00da-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398879, 'tstamp': 398879}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212871, 'error': None, 'target': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:36:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:11.508 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf37b00da-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:36:11 compute-0 nova_compute[186662]: 2026-02-19 19:36:11.509 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:11 compute-0 nova_compute[186662]: 2026-02-19 19:36:11.512 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:11.512 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf37b00da-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:36:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:11.513 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:36:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:11.513 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf37b00da-20, col_values=(('external_ids', {'iface-id': 'e682361b-89b8-4c67-9593-6b6d57e3096a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:36:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:11.514 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:36:11 compute-0 podman[212842]: 2026-02-19 19:36:11.515330858 +0000 UTC m=+0.063302189 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:36:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:11.515 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[04de5016-e0ed-449d-8a35-62a5db7b31c0]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f37b00da-2392-46ae-ac87-2c54ab8961a2\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f37b00da-2392-46ae-ac87-2c54ab8961a2\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:36:11 compute-0 nova_compute[186662]: 2026-02-19 19:36:11.643 186666 INFO nova.virt.libvirt.driver [-] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Instance destroyed successfully.
Feb 19 19:36:11 compute-0 nova_compute[186662]: 2026-02-19 19:36:11.643 186666 DEBUG nova.objects.instance [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lazy-loading 'resources' on Instance uuid ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:36:11 compute-0 nova_compute[186662]: 2026-02-19 19:36:11.866 186666 DEBUG nova.compute.manager [req-760fd88e-2498-4cd4-81da-d5a78031d3fc req-7527c321-02a4-4169-8954-018acb061242 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Received event network-vif-unplugged-8b59ae32-fc0b-48f7-8566-afb3e4db1434 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:36:11 compute-0 nova_compute[186662]: 2026-02-19 19:36:11.867 186666 DEBUG oslo_concurrency.lockutils [req-760fd88e-2498-4cd4-81da-d5a78031d3fc req-7527c321-02a4-4169-8954-018acb061242 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:36:11 compute-0 nova_compute[186662]: 2026-02-19 19:36:11.867 186666 DEBUG oslo_concurrency.lockutils [req-760fd88e-2498-4cd4-81da-d5a78031d3fc req-7527c321-02a4-4169-8954-018acb061242 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:36:11 compute-0 nova_compute[186662]: 2026-02-19 19:36:11.867 186666 DEBUG oslo_concurrency.lockutils [req-760fd88e-2498-4cd4-81da-d5a78031d3fc req-7527c321-02a4-4169-8954-018acb061242 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:36:11 compute-0 nova_compute[186662]: 2026-02-19 19:36:11.867 186666 DEBUG nova.compute.manager [req-760fd88e-2498-4cd4-81da-d5a78031d3fc req-7527c321-02a4-4169-8954-018acb061242 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] No waiting events found dispatching network-vif-unplugged-8b59ae32-fc0b-48f7-8566-afb3e4db1434 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:36:11 compute-0 nova_compute[186662]: 2026-02-19 19:36:11.867 186666 DEBUG nova.compute.manager [req-760fd88e-2498-4cd4-81da-d5a78031d3fc req-7527c321-02a4-4169-8954-018acb061242 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Received event network-vif-unplugged-8b59ae32-fc0b-48f7-8566-afb3e4db1434 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:36:11 compute-0 nova_compute[186662]: 2026-02-19 19:36:11.940 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:11.940 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:36:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:11.941 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:36:12 compute-0 nova_compute[186662]: 2026-02-19 19:36:12.149 186666 DEBUG nova.virt.libvirt.vif [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:35:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-317672826',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-317672826',id=15,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:35:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='87043e904d374d2fbc50a010c14c8987',ramdisk_id='',reservation_id='r-fyicljro',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-488758453',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-488758453-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:35:20Z,user_data=None,user_id='f74ba8e1becb4d8f83bb148785aac310',uuid=ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8b59ae32-fc0b-48f7-8566-afb3e4db1434", "address": "fa:16:3e:0f:97:74", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b59ae32-fc", "ovs_interfaceid": "8b59ae32-fc0b-48f7-8566-afb3e4db1434", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:36:12 compute-0 nova_compute[186662]: 2026-02-19 19:36:12.150 186666 DEBUG nova.network.os_vif_util [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Converting VIF {"id": "8b59ae32-fc0b-48f7-8566-afb3e4db1434", "address": "fa:16:3e:0f:97:74", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b59ae32-fc", "ovs_interfaceid": "8b59ae32-fc0b-48f7-8566-afb3e4db1434", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:36:12 compute-0 nova_compute[186662]: 2026-02-19 19:36:12.150 186666 DEBUG nova.network.os_vif_util [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:97:74,bridge_name='br-int',has_traffic_filtering=True,id=8b59ae32-fc0b-48f7-8566-afb3e4db1434,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b59ae32-fc') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:36:12 compute-0 nova_compute[186662]: 2026-02-19 19:36:12.150 186666 DEBUG os_vif [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:97:74,bridge_name='br-int',has_traffic_filtering=True,id=8b59ae32-fc0b-48f7-8566-afb3e4db1434,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b59ae32-fc') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:36:12 compute-0 nova_compute[186662]: 2026-02-19 19:36:12.151 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:12 compute-0 nova_compute[186662]: 2026-02-19 19:36:12.152 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b59ae32-fc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:36:12 compute-0 nova_compute[186662]: 2026-02-19 19:36:12.152 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:12 compute-0 nova_compute[186662]: 2026-02-19 19:36:12.153 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:12 compute-0 nova_compute[186662]: 2026-02-19 19:36:12.154 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:12 compute-0 nova_compute[186662]: 2026-02-19 19:36:12.154 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=03af9d84-c79b-47af-ad7c-5ea01c988b23) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:36:12 compute-0 nova_compute[186662]: 2026-02-19 19:36:12.155 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:12 compute-0 nova_compute[186662]: 2026-02-19 19:36:12.155 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:12 compute-0 nova_compute[186662]: 2026-02-19 19:36:12.157 186666 INFO os_vif [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:97:74,bridge_name='br-int',has_traffic_filtering=True,id=8b59ae32-fc0b-48f7-8566-afb3e4db1434,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b59ae32-fc')
Feb 19 19:36:12 compute-0 nova_compute[186662]: 2026-02-19 19:36:12.157 186666 INFO nova.virt.libvirt.driver [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Deleting instance files /var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc_del
Feb 19 19:36:12 compute-0 nova_compute[186662]: 2026-02-19 19:36:12.158 186666 INFO nova.virt.libvirt.driver [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Deletion of /var/lib/nova/instances/ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc_del complete
Feb 19 19:36:12 compute-0 nova_compute[186662]: 2026-02-19 19:36:12.668 186666 INFO nova.compute.manager [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Took 1.27 seconds to destroy the instance on the hypervisor.
Feb 19 19:36:12 compute-0 nova_compute[186662]: 2026-02-19 19:36:12.669 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:36:12 compute-0 nova_compute[186662]: 2026-02-19 19:36:12.669 186666 DEBUG nova.compute.manager [-] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:36:12 compute-0 nova_compute[186662]: 2026-02-19 19:36:12.669 186666 DEBUG nova.network.neutron [-] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:36:12 compute-0 nova_compute[186662]: 2026-02-19 19:36:12.669 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:36:13 compute-0 nova_compute[186662]: 2026-02-19 19:36:13.296 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:36:13 compute-0 nova_compute[186662]: 2026-02-19 19:36:13.613 186666 DEBUG nova.compute.manager [req-bbfe6a92-3a27-4ba3-aa69-2b4e256beed7 req-f8432d09-70d9-456a-a6c1-bf3d8eba2453 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Received event network-vif-deleted-8b59ae32-fc0b-48f7-8566-afb3e4db1434 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:36:13 compute-0 nova_compute[186662]: 2026-02-19 19:36:13.613 186666 INFO nova.compute.manager [req-bbfe6a92-3a27-4ba3-aa69-2b4e256beed7 req-f8432d09-70d9-456a-a6c1-bf3d8eba2453 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Neutron deleted interface 8b59ae32-fc0b-48f7-8566-afb3e4db1434; detaching it from the instance and deleting it from the info cache
Feb 19 19:36:13 compute-0 nova_compute[186662]: 2026-02-19 19:36:13.613 186666 DEBUG nova.network.neutron [req-bbfe6a92-3a27-4ba3-aa69-2b4e256beed7 req-f8432d09-70d9-456a-a6c1-bf3d8eba2453 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:36:13 compute-0 nova_compute[186662]: 2026-02-19 19:36:13.861 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:13 compute-0 nova_compute[186662]: 2026-02-19 19:36:13.935 186666 DEBUG nova.compute.manager [req-bf8c0968-f0c1-450d-a96e-056de146ce2a req-fb6c8f5d-f0e0-48c4-aa74-f24375edea7f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Received event network-vif-unplugged-8b59ae32-fc0b-48f7-8566-afb3e4db1434 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:36:13 compute-0 nova_compute[186662]: 2026-02-19 19:36:13.936 186666 DEBUG oslo_concurrency.lockutils [req-bf8c0968-f0c1-450d-a96e-056de146ce2a req-fb6c8f5d-f0e0-48c4-aa74-f24375edea7f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:36:13 compute-0 nova_compute[186662]: 2026-02-19 19:36:13.936 186666 DEBUG oslo_concurrency.lockutils [req-bf8c0968-f0c1-450d-a96e-056de146ce2a req-fb6c8f5d-f0e0-48c4-aa74-f24375edea7f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:36:13 compute-0 nova_compute[186662]: 2026-02-19 19:36:13.936 186666 DEBUG oslo_concurrency.lockutils [req-bf8c0968-f0c1-450d-a96e-056de146ce2a req-fb6c8f5d-f0e0-48c4-aa74-f24375edea7f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:36:13 compute-0 nova_compute[186662]: 2026-02-19 19:36:13.936 186666 DEBUG nova.compute.manager [req-bf8c0968-f0c1-450d-a96e-056de146ce2a req-fb6c8f5d-f0e0-48c4-aa74-f24375edea7f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] No waiting events found dispatching network-vif-unplugged-8b59ae32-fc0b-48f7-8566-afb3e4db1434 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:36:13 compute-0 nova_compute[186662]: 2026-02-19 19:36:13.937 186666 DEBUG nova.compute.manager [req-bf8c0968-f0c1-450d-a96e-056de146ce2a req-fb6c8f5d-f0e0-48c4-aa74-f24375edea7f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Received event network-vif-unplugged-8b59ae32-fc0b-48f7-8566-afb3e4db1434 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:36:14 compute-0 nova_compute[186662]: 2026-02-19 19:36:14.070 186666 DEBUG nova.network.neutron [-] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:36:14 compute-0 nova_compute[186662]: 2026-02-19 19:36:14.120 186666 DEBUG nova.compute.manager [req-bbfe6a92-3a27-4ba3-aa69-2b4e256beed7 req-f8432d09-70d9-456a-a6c1-bf3d8eba2453 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Detach interface failed, port_id=8b59ae32-fc0b-48f7-8566-afb3e4db1434, reason: Instance ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Feb 19 19:36:14 compute-0 nova_compute[186662]: 2026-02-19 19:36:14.577 186666 INFO nova.compute.manager [-] [instance: ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc] Took 1.91 seconds to deallocate network for instance.
Feb 19 19:36:15 compute-0 nova_compute[186662]: 2026-02-19 19:36:15.102 186666 DEBUG oslo_concurrency.lockutils [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:36:15 compute-0 nova_compute[186662]: 2026-02-19 19:36:15.102 186666 DEBUG oslo_concurrency.lockutils [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:36:15 compute-0 nova_compute[186662]: 2026-02-19 19:36:15.157 186666 DEBUG nova.compute.provider_tree [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:36:15 compute-0 nova_compute[186662]: 2026-02-19 19:36:15.663 186666 DEBUG nova.scheduler.client.report [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:36:16 compute-0 nova_compute[186662]: 2026-02-19 19:36:16.171 186666 DEBUG oslo_concurrency.lockutils [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.069s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:36:16 compute-0 nova_compute[186662]: 2026-02-19 19:36:16.205 186666 INFO nova.scheduler.client.report [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Deleted allocations for instance ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc
Feb 19 19:36:16 compute-0 podman[212897]: 2026-02-19 19:36:16.2917522 +0000 UTC m=+0.063765481 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, release=1770267347, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter)
Feb 19 19:36:16 compute-0 nova_compute[186662]: 2026-02-19 19:36:16.486 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:36:16 compute-0 nova_compute[186662]: 2026-02-19 19:36:16.486 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:36:16 compute-0 nova_compute[186662]: 2026-02-19 19:36:16.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:36:16 compute-0 nova_compute[186662]: 2026-02-19 19:36:16.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:36:17 compute-0 nova_compute[186662]: 2026-02-19 19:36:17.156 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:17 compute-0 nova_compute[186662]: 2026-02-19 19:36:17.244 186666 DEBUG oslo_concurrency.lockutils [None req-581fd343-f091-431f-b206-4b08106506ec f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "ddcbd1b4-af8c-4f3f-bdb6-82aa3e9fbbdc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.377s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:36:17 compute-0 nova_compute[186662]: 2026-02-19 19:36:17.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:36:17 compute-0 nova_compute[186662]: 2026-02-19 19:36:17.957 186666 DEBUG oslo_concurrency.lockutils [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "df9e632d-e014-4389-bc9d-471df8d0131c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:36:17 compute-0 nova_compute[186662]: 2026-02-19 19:36:17.957 186666 DEBUG oslo_concurrency.lockutils [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "df9e632d-e014-4389-bc9d-471df8d0131c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:36:17 compute-0 nova_compute[186662]: 2026-02-19 19:36:17.958 186666 DEBUG oslo_concurrency.lockutils [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "df9e632d-e014-4389-bc9d-471df8d0131c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:36:17 compute-0 nova_compute[186662]: 2026-02-19 19:36:17.958 186666 DEBUG oslo_concurrency.lockutils [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "df9e632d-e014-4389-bc9d-471df8d0131c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:36:17 compute-0 nova_compute[186662]: 2026-02-19 19:36:17.959 186666 DEBUG oslo_concurrency.lockutils [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "df9e632d-e014-4389-bc9d-471df8d0131c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:36:17 compute-0 nova_compute[186662]: 2026-02-19 19:36:17.974 186666 INFO nova.compute.manager [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Terminating instance
Feb 19 19:36:18 compute-0 podman[212919]: 2026-02-19 19:36:18.289511132 +0000 UTC m=+0.064802865 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller)
Feb 19 19:36:18 compute-0 nova_compute[186662]: 2026-02-19 19:36:18.497 186666 DEBUG nova.compute.manager [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:36:18 compute-0 kernel: tap015734ec-d8 (unregistering): left promiscuous mode
Feb 19 19:36:18 compute-0 NetworkManager[56519]: <info>  [1771529778.5201] device (tap015734ec-d8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:36:18 compute-0 nova_compute[186662]: 2026-02-19 19:36:18.522 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:18 compute-0 ovn_controller[96653]: 2026-02-19T19:36:18Z|00137|binding|INFO|Releasing lport 015734ec-d897-4090-8bc9-7d5e71101ac0 from this chassis (sb_readonly=0)
Feb 19 19:36:18 compute-0 ovn_controller[96653]: 2026-02-19T19:36:18Z|00138|binding|INFO|Setting lport 015734ec-d897-4090-8bc9-7d5e71101ac0 down in Southbound
Feb 19 19:36:18 compute-0 ovn_controller[96653]: 2026-02-19T19:36:18Z|00139|binding|INFO|Removing iface tap015734ec-d8 ovn-installed in OVS
Feb 19 19:36:18 compute-0 nova_compute[186662]: 2026-02-19 19:36:18.525 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:18.530 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:bd:9f 10.100.0.10'], port_security=['fa:16:3e:d9:bd:9f 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'df9e632d-e014-4389-bc9d-471df8d0131c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '87043e904d374d2fbc50a010c14c8987', 'neutron:revision_number': '14', 'neutron:security_group_ids': '624104e9-162a-4121-9113-49a4c95099e4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82a17ccb-74c9-404e-969f-7001a470c860, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=015734ec-d897-4090-8bc9-7d5e71101ac0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:36:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:18.531 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 015734ec-d897-4090-8bc9-7d5e71101ac0 in datapath f37b00da-2392-46ae-ac87-2c54ab8961a2 unbound from our chassis
Feb 19 19:36:18 compute-0 nova_compute[186662]: 2026-02-19 19:36:18.531 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:18.532 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f37b00da-2392-46ae-ac87-2c54ab8961a2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:36:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:18.533 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[607a3ac9-1eee-4f30-8ff9-e03d1771169a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:36:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:18.534 105986 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2 namespace which is not needed anymore
Feb 19 19:36:18 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Feb 19 19:36:18 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000e.scope: Consumed 2.370s CPU time.
Feb 19 19:36:18 compute-0 systemd-machined[156014]: Machine qemu-12-instance-0000000e terminated.
Feb 19 19:36:18 compute-0 podman[212970]: 2026-02-19 19:36:18.665869479 +0000 UTC m=+0.032083706 container kill 582ab52d9acfeb423aadc86b7f222e2e43d1bfeaeae439156cef537c888b27fc (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 19 19:36:18 compute-0 neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2[212607]: [NOTICE]   (212612) : haproxy version is 3.0.5-8e879a5
Feb 19 19:36:18 compute-0 neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2[212607]: [NOTICE]   (212612) : path to executable is /usr/sbin/haproxy
Feb 19 19:36:18 compute-0 neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2[212607]: [WARNING]  (212612) : Exiting Master process...
Feb 19 19:36:18 compute-0 neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2[212607]: [ALERT]    (212612) : Current worker (212614) exited with code 143 (Terminated)
Feb 19 19:36:18 compute-0 neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2[212607]: [WARNING]  (212612) : All workers exited. Exiting... (0)
Feb 19 19:36:18 compute-0 systemd[1]: libpod-582ab52d9acfeb423aadc86b7f222e2e43d1bfeaeae439156cef537c888b27fc.scope: Deactivated successfully.
Feb 19 19:36:18 compute-0 nova_compute[186662]: 2026-02-19 19:36:18.681 186666 DEBUG nova.compute.manager [req-12d4fa1f-0169-432e-85d5-92936f7febb9 req-38fd2fb3-14a1-497a-97f0-aa6b17ab50bb 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Received event network-vif-unplugged-015734ec-d897-4090-8bc9-7d5e71101ac0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:36:18 compute-0 nova_compute[186662]: 2026-02-19 19:36:18.681 186666 DEBUG oslo_concurrency.lockutils [req-12d4fa1f-0169-432e-85d5-92936f7febb9 req-38fd2fb3-14a1-497a-97f0-aa6b17ab50bb 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "df9e632d-e014-4389-bc9d-471df8d0131c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:36:18 compute-0 nova_compute[186662]: 2026-02-19 19:36:18.681 186666 DEBUG oslo_concurrency.lockutils [req-12d4fa1f-0169-432e-85d5-92936f7febb9 req-38fd2fb3-14a1-497a-97f0-aa6b17ab50bb 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "df9e632d-e014-4389-bc9d-471df8d0131c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:36:18 compute-0 nova_compute[186662]: 2026-02-19 19:36:18.681 186666 DEBUG oslo_concurrency.lockutils [req-12d4fa1f-0169-432e-85d5-92936f7febb9 req-38fd2fb3-14a1-497a-97f0-aa6b17ab50bb 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "df9e632d-e014-4389-bc9d-471df8d0131c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:36:18 compute-0 nova_compute[186662]: 2026-02-19 19:36:18.681 186666 DEBUG nova.compute.manager [req-12d4fa1f-0169-432e-85d5-92936f7febb9 req-38fd2fb3-14a1-497a-97f0-aa6b17ab50bb 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] No waiting events found dispatching network-vif-unplugged-015734ec-d897-4090-8bc9-7d5e71101ac0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:36:18 compute-0 nova_compute[186662]: 2026-02-19 19:36:18.682 186666 DEBUG nova.compute.manager [req-12d4fa1f-0169-432e-85d5-92936f7febb9 req-38fd2fb3-14a1-497a-97f0-aa6b17ab50bb 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Received event network-vif-unplugged-015734ec-d897-4090-8bc9-7d5e71101ac0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:36:18 compute-0 podman[212986]: 2026-02-19 19:36:18.705984728 +0000 UTC m=+0.021459790 container died 582ab52d9acfeb423aadc86b7f222e2e43d1bfeaeae439156cef537c888b27fc (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.43.0)
Feb 19 19:36:18 compute-0 nova_compute[186662]: 2026-02-19 19:36:18.757 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:18 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-582ab52d9acfeb423aadc86b7f222e2e43d1bfeaeae439156cef537c888b27fc-userdata-shm.mount: Deactivated successfully.
Feb 19 19:36:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-aac5edc62b3b86fdf0ab164426329eb68194ee646dea4d4c3640f118f3e48414-merged.mount: Deactivated successfully.
Feb 19 19:36:18 compute-0 podman[212986]: 2026-02-19 19:36:18.777963345 +0000 UTC m=+0.093438407 container cleanup 582ab52d9acfeb423aadc86b7f222e2e43d1bfeaeae439156cef537c888b27fc (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Feb 19 19:36:18 compute-0 systemd[1]: libpod-conmon-582ab52d9acfeb423aadc86b7f222e2e43d1bfeaeae439156cef537c888b27fc.scope: Deactivated successfully.
Feb 19 19:36:18 compute-0 podman[212987]: 2026-02-19 19:36:18.792917787 +0000 UTC m=+0.100086388 container remove 582ab52d9acfeb423aadc86b7f222e2e43d1bfeaeae439156cef537c888b27fc (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2, io.buildah.version=1.43.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:36:18 compute-0 nova_compute[186662]: 2026-02-19 19:36:18.792 186666 INFO nova.virt.libvirt.driver [-] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Instance destroyed successfully.
Feb 19 19:36:18 compute-0 nova_compute[186662]: 2026-02-19 19:36:18.794 186666 DEBUG nova.objects.instance [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lazy-loading 'resources' on Instance uuid df9e632d-e014-4389-bc9d-471df8d0131c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:36:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:18.797 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ea8d4aca-b821-419b-b14f-20d5fca6a037]: (4, ("Thu Feb 19 07:36:18 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2 (582ab52d9acfeb423aadc86b7f222e2e43d1bfeaeae439156cef537c888b27fc)\n582ab52d9acfeb423aadc86b7f222e2e43d1bfeaeae439156cef537c888b27fc\nThu Feb 19 07:36:18 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2 (582ab52d9acfeb423aadc86b7f222e2e43d1bfeaeae439156cef537c888b27fc)\n582ab52d9acfeb423aadc86b7f222e2e43d1bfeaeae439156cef537c888b27fc\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:36:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:18.798 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[0b551de2-f86c-4954-9b63-464907ca3259]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:36:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:18.799 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f37b00da-2392-46ae-ac87-2c54ab8961a2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:36:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:18.799 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a57b1cb0-3cc0-4170-ac96-0be0b353f138]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:36:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:18.800 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf37b00da-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:36:18 compute-0 nova_compute[186662]: 2026-02-19 19:36:18.802 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:18 compute-0 kernel: tapf37b00da-20: left promiscuous mode
Feb 19 19:36:18 compute-0 nova_compute[186662]: 2026-02-19 19:36:18.809 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:18.811 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[9c474913-31fa-44a2-915e-e94ac198fe2c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:36:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:18.828 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[aacba290-6e45-43e8-b67a-1200c37c6f59]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:36:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:18.829 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[f990bfde-472b-4c39-a8f8-592ec259166a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:36:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:18.838 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[7fd86ba4-e04d-4947-a382-da8ccc11230c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398861, 'reachable_time': 21581, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213034, 'error': None, 'target': 'ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:36:18 compute-0 systemd[1]: run-netns-ovnmeta\x2df37b00da\x2d2392\x2d46ae\x2dac87\x2d2c54ab8961a2.mount: Deactivated successfully.
Feb 19 19:36:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:18.843 106358 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f37b00da-2392-46ae-ac87-2c54ab8961a2 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Feb 19 19:36:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:18.845 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[5da156b0-c3fd-4715-a7c1-fb03910e780a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:36:18 compute-0 nova_compute[186662]: 2026-02-19 19:36:18.861 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.308 186666 DEBUG nova.virt.libvirt.vif [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-02-19T19:34:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-2073515523',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-2073515523',id=14,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:35:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='87043e904d374d2fbc50a010c14c8987',ramdisk_id='',reservation_id='r-yx5r32yr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-488758453',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-488758453-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:36:00Z,user_data=None,user_id='f74ba8e1becb4d8f83bb148785aac310',uuid=df9e632d-e014-4389-bc9d-471df8d0131c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "015734ec-d897-4090-8bc9-7d5e71101ac0", "address": "fa:16:3e:d9:bd:9f", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap015734ec-d8", "ovs_interfaceid": "015734ec-d897-4090-8bc9-7d5e71101ac0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.308 186666 DEBUG nova.network.os_vif_util [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Converting VIF {"id": "015734ec-d897-4090-8bc9-7d5e71101ac0", "address": "fa:16:3e:d9:bd:9f", "network": {"id": "f37b00da-2392-46ae-ac87-2c54ab8961a2", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1323113689-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61a8fd888cc1408eaeded54a293416ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap015734ec-d8", "ovs_interfaceid": "015734ec-d897-4090-8bc9-7d5e71101ac0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.309 186666 DEBUG nova.network.os_vif_util [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d9:bd:9f,bridge_name='br-int',has_traffic_filtering=True,id=015734ec-d897-4090-8bc9-7d5e71101ac0,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap015734ec-d8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.309 186666 DEBUG os_vif [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d9:bd:9f,bridge_name='br-int',has_traffic_filtering=True,id=015734ec-d897-4090-8bc9-7d5e71101ac0,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap015734ec-d8') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.310 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.310 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap015734ec-d8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.312 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.313 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.314 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.315 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.315 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=37b5eedd-0574-4c47-a837-54a621fd49cb) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.316 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.317 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.319 186666 INFO os_vif [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d9:bd:9f,bridge_name='br-int',has_traffic_filtering=True,id=015734ec-d897-4090-8bc9-7d5e71101ac0,network=Network(f37b00da-2392-46ae-ac87-2c54ab8961a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap015734ec-d8')
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.319 186666 INFO nova.virt.libvirt.driver [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Deleting instance files /var/lib/nova/instances/df9e632d-e014-4389-bc9d-471df8d0131c_del
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.320 186666 INFO nova.virt.libvirt.driver [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Deletion of /var/lib/nova/instances/df9e632d-e014-4389-bc9d-471df8d0131c_del complete
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.830 186666 INFO nova.compute.manager [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Took 1.33 seconds to destroy the instance on the hypervisor.
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.830 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.831 186666 DEBUG nova.compute.manager [-] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.831 186666 DEBUG nova.network.neutron [-] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:36:19 compute-0 nova_compute[186662]: 2026-02-19 19:36:19.832 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:36:19 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:19.942 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:36:20 compute-0 nova_compute[186662]: 2026-02-19 19:36:20.390 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:36:20 compute-0 nova_compute[186662]: 2026-02-19 19:36:20.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:36:20 compute-0 nova_compute[186662]: 2026-02-19 19:36:20.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:36:20 compute-0 nova_compute[186662]: 2026-02-19 19:36:20.730 186666 DEBUG nova.compute.manager [req-a2aca3e0-14fc-4176-b04c-2f3228397440 req-815285e5-17d5-42c8-9598-e663f8686320 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Received event network-vif-unplugged-015734ec-d897-4090-8bc9-7d5e71101ac0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:36:20 compute-0 nova_compute[186662]: 2026-02-19 19:36:20.731 186666 DEBUG oslo_concurrency.lockutils [req-a2aca3e0-14fc-4176-b04c-2f3228397440 req-815285e5-17d5-42c8-9598-e663f8686320 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "df9e632d-e014-4389-bc9d-471df8d0131c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:36:20 compute-0 nova_compute[186662]: 2026-02-19 19:36:20.731 186666 DEBUG oslo_concurrency.lockutils [req-a2aca3e0-14fc-4176-b04c-2f3228397440 req-815285e5-17d5-42c8-9598-e663f8686320 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "df9e632d-e014-4389-bc9d-471df8d0131c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:36:20 compute-0 nova_compute[186662]: 2026-02-19 19:36:20.732 186666 DEBUG oslo_concurrency.lockutils [req-a2aca3e0-14fc-4176-b04c-2f3228397440 req-815285e5-17d5-42c8-9598-e663f8686320 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "df9e632d-e014-4389-bc9d-471df8d0131c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:36:20 compute-0 nova_compute[186662]: 2026-02-19 19:36:20.732 186666 DEBUG nova.compute.manager [req-a2aca3e0-14fc-4176-b04c-2f3228397440 req-815285e5-17d5-42c8-9598-e663f8686320 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] No waiting events found dispatching network-vif-unplugged-015734ec-d897-4090-8bc9-7d5e71101ac0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:36:20 compute-0 nova_compute[186662]: 2026-02-19 19:36:20.732 186666 DEBUG nova.compute.manager [req-a2aca3e0-14fc-4176-b04c-2f3228397440 req-815285e5-17d5-42c8-9598-e663f8686320 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Received event network-vif-unplugged-015734ec-d897-4090-8bc9-7d5e71101ac0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:36:20 compute-0 nova_compute[186662]: 2026-02-19 19:36:20.733 186666 DEBUG nova.compute.manager [req-a2aca3e0-14fc-4176-b04c-2f3228397440 req-815285e5-17d5-42c8-9598-e663f8686320 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Received event network-vif-deleted-015734ec-d897-4090-8bc9-7d5e71101ac0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:36:20 compute-0 nova_compute[186662]: 2026-02-19 19:36:20.733 186666 INFO nova.compute.manager [req-a2aca3e0-14fc-4176-b04c-2f3228397440 req-815285e5-17d5-42c8-9598-e663f8686320 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Neutron deleted interface 015734ec-d897-4090-8bc9-7d5e71101ac0; detaching it from the instance and deleting it from the info cache
Feb 19 19:36:20 compute-0 nova_compute[186662]: 2026-02-19 19:36:20.734 186666 DEBUG nova.network.neutron [req-a2aca3e0-14fc-4176-b04c-2f3228397440 req-815285e5-17d5-42c8-9598-e663f8686320 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:36:21 compute-0 nova_compute[186662]: 2026-02-19 19:36:21.111 186666 DEBUG nova.network.neutron [-] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:36:21 compute-0 nova_compute[186662]: 2026-02-19 19:36:21.242 186666 DEBUG nova.compute.manager [req-a2aca3e0-14fc-4176-b04c-2f3228397440 req-815285e5-17d5-42c8-9598-e663f8686320 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Detach interface failed, port_id=015734ec-d897-4090-8bc9-7d5e71101ac0, reason: Instance df9e632d-e014-4389-bc9d-471df8d0131c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Feb 19 19:36:21 compute-0 nova_compute[186662]: 2026-02-19 19:36:21.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:36:21 compute-0 nova_compute[186662]: 2026-02-19 19:36:21.615 186666 INFO nova.compute.manager [-] [instance: df9e632d-e014-4389-bc9d-471df8d0131c] Took 1.78 seconds to deallocate network for instance.
Feb 19 19:36:22 compute-0 nova_compute[186662]: 2026-02-19 19:36:22.085 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:36:22 compute-0 nova_compute[186662]: 2026-02-19 19:36:22.085 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:36:22 compute-0 nova_compute[186662]: 2026-02-19 19:36:22.085 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:36:22 compute-0 nova_compute[186662]: 2026-02-19 19:36:22.085 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:36:22 compute-0 nova_compute[186662]: 2026-02-19 19:36:22.131 186666 DEBUG oslo_concurrency.lockutils [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:36:22 compute-0 nova_compute[186662]: 2026-02-19 19:36:22.131 186666 DEBUG oslo_concurrency.lockutils [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:36:22 compute-0 nova_compute[186662]: 2026-02-19 19:36:22.135 186666 DEBUG oslo_concurrency.lockutils [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:36:22 compute-0 nova_compute[186662]: 2026-02-19 19:36:22.264 186666 INFO nova.scheduler.client.report [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Deleted allocations for instance df9e632d-e014-4389-bc9d-471df8d0131c
Feb 19 19:36:22 compute-0 nova_compute[186662]: 2026-02-19 19:36:22.281 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:36:22 compute-0 nova_compute[186662]: 2026-02-19 19:36:22.282 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:36:22 compute-0 nova_compute[186662]: 2026-02-19 19:36:22.296 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.013s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:36:22 compute-0 nova_compute[186662]: 2026-02-19 19:36:22.297 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5832MB free_disk=72.97665786743164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:36:22 compute-0 nova_compute[186662]: 2026-02-19 19:36:22.297 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:36:22 compute-0 nova_compute[186662]: 2026-02-19 19:36:22.297 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:36:23 compute-0 podman[213037]: 2026-02-19 19:36:23.279026598 +0000 UTC m=+0.054945277 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 19 19:36:23 compute-0 nova_compute[186662]: 2026-02-19 19:36:23.295 186666 DEBUG oslo_concurrency.lockutils [None req-e35cb0b8-6670-40fb-91c4-8db00f06cbe0 f74ba8e1becb4d8f83bb148785aac310 87043e904d374d2fbc50a010c14c8987 - - default default] Lock "df9e632d-e014-4389-bc9d-471df8d0131c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.337s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:36:23 compute-0 nova_compute[186662]: 2026-02-19 19:36:23.377 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:36:23 compute-0 nova_compute[186662]: 2026-02-19 19:36:23.378 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:36:22 up  1:07,  0 user,  load average: 0.47, 0.38, 0.37\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:36:23 compute-0 nova_compute[186662]: 2026-02-19 19:36:23.420 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:36:23 compute-0 nova_compute[186662]: 2026-02-19 19:36:23.862 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:23 compute-0 nova_compute[186662]: 2026-02-19 19:36:23.926 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:36:24 compute-0 nova_compute[186662]: 2026-02-19 19:36:24.317 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:24 compute-0 nova_compute[186662]: 2026-02-19 19:36:24.434 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:36:24 compute-0 nova_compute[186662]: 2026-02-19 19:36:24.435 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.137s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:36:27 compute-0 nova_compute[186662]: 2026-02-19 19:36:27.362 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:36:28 compute-0 nova_compute[186662]: 2026-02-19 19:36:28.865 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:28 compute-0 nova_compute[186662]: 2026-02-19 19:36:28.914 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:29 compute-0 nova_compute[186662]: 2026-02-19 19:36:29.319 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:29 compute-0 podman[196025]: time="2026-02-19T19:36:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:36:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:36:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:36:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:36:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2194 "" "Go-http-client/1.1"
Feb 19 19:36:31 compute-0 openstack_network_exporter[198916]: ERROR   19:36:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:36:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:36:31 compute-0 openstack_network_exporter[198916]: ERROR   19:36:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:36:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:36:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:32.136 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:36:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:32.136 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:36:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:32.137 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:36:33 compute-0 nova_compute[186662]: 2026-02-19 19:36:33.866 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:34 compute-0 nova_compute[186662]: 2026-02-19 19:36:34.321 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:38 compute-0 nova_compute[186662]: 2026-02-19 19:36:38.903 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:39 compute-0 nova_compute[186662]: 2026-02-19 19:36:39.323 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:40.069 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:22:a3 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a8815e8f-3945-4eb5-98d8-e31bbb7c874a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a82810a912eb44f4bbe52dfbc8765740', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ccf569e3-981f-424a-aec6-af69ea494142, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=96538c20-f183-445c-bcae-80553c2c1fd2) old=Port_Binding(mac=['fa:16:3e:32:22:a3'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a8815e8f-3945-4eb5-98d8-e31bbb7c874a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a82810a912eb44f4bbe52dfbc8765740', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:36:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:40.070 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 96538c20-f183-445c-bcae-80553c2c1fd2 in datapath a8815e8f-3945-4eb5-98d8-e31bbb7c874a updated
Feb 19 19:36:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:40.071 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a8815e8f-3945-4eb5-98d8-e31bbb7c874a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:36:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:40.071 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[33304bc3-c8ae-4c71-b2c0-f9609bdab3cd]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:36:42 compute-0 podman[213062]: 2026-02-19 19:36:42.278712369 +0000 UTC m=+0.052297154 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:36:43 compute-0 nova_compute[186662]: 2026-02-19 19:36:43.908 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:44 compute-0 nova_compute[186662]: 2026-02-19 19:36:44.324 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:46.805 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:b1:01 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d235eadd-16f9-4e19-a93d-487a05586717', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d235eadd-16f9-4e19-a93d-487a05586717', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '673c40a9f52d4914b8ae3fc458b05edf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=01a39936-5d7b-4538-b9d8-3c92f87f592c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=65e37e84-4126-4535-b26e-3d88ae2a0c90) old=Port_Binding(mac=['fa:16:3e:d7:b1:01'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-d235eadd-16f9-4e19-a93d-487a05586717', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d235eadd-16f9-4e19-a93d-487a05586717', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '673c40a9f52d4914b8ae3fc458b05edf', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:36:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:46.805 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 65e37e84-4126-4535-b26e-3d88ae2a0c90 in datapath d235eadd-16f9-4e19-a93d-487a05586717 updated
Feb 19 19:36:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:46.806 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d235eadd-16f9-4e19-a93d-487a05586717, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:36:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:36:46.806 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[d0e4b151-2ed3-4eef-a575-933a1f90481f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:36:47 compute-0 podman[213082]: 2026-02-19 19:36:47.306440225 +0000 UTC m=+0.079738930 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1770267347, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, vcs-type=git)
Feb 19 19:36:48 compute-0 nova_compute[186662]: 2026-02-19 19:36:48.912 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:49 compute-0 nova_compute[186662]: 2026-02-19 19:36:49.326 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:49 compute-0 podman[213103]: 2026-02-19 19:36:49.343249836 +0000 UTC m=+0.110832576 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.build-date=20260216)
Feb 19 19:36:53 compute-0 sshd-session[213129]: Invalid user claude from 45.169.200.254 port 57076
Feb 19 19:36:53 compute-0 podman[213131]: 2026-02-19 19:36:53.450741079 +0000 UTC m=+0.046391499 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 19 19:36:53 compute-0 sshd-session[213129]: Received disconnect from 45.169.200.254 port 57076:11: Bye Bye [preauth]
Feb 19 19:36:53 compute-0 sshd-session[213129]: Disconnected from invalid user claude 45.169.200.254 port 57076 [preauth]
Feb 19 19:36:53 compute-0 nova_compute[186662]: 2026-02-19 19:36:53.915 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:54 compute-0 nova_compute[186662]: 2026-02-19 19:36:54.328 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:58 compute-0 nova_compute[186662]: 2026-02-19 19:36:58.969 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:59 compute-0 nova_compute[186662]: 2026-02-19 19:36:59.331 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:36:59 compute-0 podman[196025]: time="2026-02-19T19:36:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:36:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:36:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:36:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:36:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2194 "" "Go-http-client/1.1"
Feb 19 19:37:01 compute-0 openstack_network_exporter[198916]: ERROR   19:37:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:37:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:37:01 compute-0 openstack_network_exporter[198916]: ERROR   19:37:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:37:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:37:01 compute-0 ovn_controller[96653]: 2026-02-19T19:37:01Z|00140|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 19 19:37:03 compute-0 nova_compute[186662]: 2026-02-19 19:37:03.971 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:04 compute-0 nova_compute[186662]: 2026-02-19 19:37:04.334 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:08 compute-0 nova_compute[186662]: 2026-02-19 19:37:08.974 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:09 compute-0 nova_compute[186662]: 2026-02-19 19:37:09.336 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:13 compute-0 podman[213155]: 2026-02-19 19:37:13.259167485 +0000 UTC m=+0.040387672 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:37:13 compute-0 nova_compute[186662]: 2026-02-19 19:37:13.976 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:14 compute-0 nova_compute[186662]: 2026-02-19 19:37:14.338 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:15 compute-0 nova_compute[186662]: 2026-02-19 19:37:15.083 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:37:15 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:15.708 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:37:15 compute-0 nova_compute[186662]: 2026-02-19 19:37:15.708 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:15 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:15.708 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:37:17 compute-0 nova_compute[186662]: 2026-02-19 19:37:17.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:37:17 compute-0 nova_compute[186662]: 2026-02-19 19:37:17.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:37:18 compute-0 podman[213174]: 2026-02-19 19:37:18.266364674 +0000 UTC m=+0.047106037 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 19 19:37:18 compute-0 nova_compute[186662]: 2026-02-19 19:37:18.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:37:18 compute-0 nova_compute[186662]: 2026-02-19 19:37:18.576 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:37:18 compute-0 nova_compute[186662]: 2026-02-19 19:37:18.978 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:19 compute-0 nova_compute[186662]: 2026-02-19 19:37:19.340 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:20 compute-0 podman[213195]: 2026-02-19 19:37:20.293582751 +0000 UTC m=+0.073442187 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.build-date=20260216, config_id=ovn_controller)
Feb 19 19:37:20 compute-0 nova_compute[186662]: 2026-02-19 19:37:20.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:37:20 compute-0 nova_compute[186662]: 2026-02-19 19:37:20.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:37:21 compute-0 nova_compute[186662]: 2026-02-19 19:37:21.420 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Acquiring lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:37:21 compute-0 nova_compute[186662]: 2026-02-19 19:37:21.420 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:37:21 compute-0 nova_compute[186662]: 2026-02-19 19:37:21.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:37:21 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:21.709 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:37:21 compute-0 nova_compute[186662]: 2026-02-19 19:37:21.924 186666 DEBUG nova.compute.manager [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Feb 19 19:37:22 compute-0 nova_compute[186662]: 2026-02-19 19:37:22.487 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:37:22 compute-0 nova_compute[186662]: 2026-02-19 19:37:22.487 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:37:22 compute-0 nova_compute[186662]: 2026-02-19 19:37:22.492 186666 DEBUG nova.virt.hardware [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Feb 19 19:37:22 compute-0 nova_compute[186662]: 2026-02-19 19:37:22.493 186666 INFO nova.compute.claims [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Claim successful on node compute-0.ctlplane.example.com
Feb 19 19:37:23 compute-0 nova_compute[186662]: 2026-02-19 19:37:23.568 186666 DEBUG nova.compute.provider_tree [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:37:23 compute-0 nova_compute[186662]: 2026-02-19 19:37:23.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:37:24 compute-0 nova_compute[186662]: 2026-02-19 19:37:24.030 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:24 compute-0 nova_compute[186662]: 2026-02-19 19:37:24.074 186666 DEBUG nova.scheduler.client.report [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:37:24 compute-0 nova_compute[186662]: 2026-02-19 19:37:24.081 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:37:24 compute-0 podman[213222]: 2026-02-19 19:37:24.293808167 +0000 UTC m=+0.070197579 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:37:24 compute-0 nova_compute[186662]: 2026-02-19 19:37:24.342 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:24 compute-0 nova_compute[186662]: 2026-02-19 19:37:24.584 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.097s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:37:24 compute-0 nova_compute[186662]: 2026-02-19 19:37:24.586 186666 DEBUG nova.compute.manager [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Feb 19 19:37:24 compute-0 nova_compute[186662]: 2026-02-19 19:37:24.589 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:37:24 compute-0 nova_compute[186662]: 2026-02-19 19:37:24.589 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:37:24 compute-0 nova_compute[186662]: 2026-02-19 19:37:24.590 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:37:24 compute-0 nova_compute[186662]: 2026-02-19 19:37:24.590 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:37:24 compute-0 nova_compute[186662]: 2026-02-19 19:37:24.724 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:37:24 compute-0 nova_compute[186662]: 2026-02-19 19:37:24.725 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:37:24 compute-0 nova_compute[186662]: 2026-02-19 19:37:24.736 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:37:24 compute-0 nova_compute[186662]: 2026-02-19 19:37:24.737 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5835MB free_disk=72.97665786743164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:37:24 compute-0 nova_compute[186662]: 2026-02-19 19:37:24.737 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:37:24 compute-0 nova_compute[186662]: 2026-02-19 19:37:24.737 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:37:25 compute-0 nova_compute[186662]: 2026-02-19 19:37:25.097 186666 DEBUG nova.compute.manager [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Feb 19 19:37:25 compute-0 nova_compute[186662]: 2026-02-19 19:37:25.097 186666 DEBUG nova.network.neutron [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Feb 19 19:37:25 compute-0 nova_compute[186662]: 2026-02-19 19:37:25.098 186666 WARNING neutronclient.v2_0.client [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:37:25 compute-0 nova_compute[186662]: 2026-02-19 19:37:25.098 186666 WARNING neutronclient.v2_0.client [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:37:25 compute-0 nova_compute[186662]: 2026-02-19 19:37:25.608 186666 INFO nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 19 19:37:25 compute-0 nova_compute[186662]: 2026-02-19 19:37:25.776 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance 31cc0f13-bb90-49cf-b9a0-e018920aabeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:37:25 compute-0 nova_compute[186662]: 2026-02-19 19:37:25.776 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:37:25 compute-0 nova_compute[186662]: 2026-02-19 19:37:25.777 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:37:24 up  1:08,  0 user,  load average: 0.21, 0.32, 0.35\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_networking': '1', 'num_os_type_None': '1', 'num_proj_673c40a9f52d4914b8ae3fc458b05edf': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:37:25 compute-0 nova_compute[186662]: 2026-02-19 19:37:25.833 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:37:25 compute-0 nova_compute[186662]: 2026-02-19 19:37:25.922 186666 DEBUG nova.network.neutron [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Successfully created port: ed766c0f-2a6a-47e2-b4a2-456e041040d2 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Feb 19 19:37:26 compute-0 nova_compute[186662]: 2026-02-19 19:37:26.115 186666 DEBUG nova.compute.manager [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Feb 19 19:37:26 compute-0 nova_compute[186662]: 2026-02-19 19:37:26.341 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:37:26 compute-0 nova_compute[186662]: 2026-02-19 19:37:26.852 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:37:26 compute-0 nova_compute[186662]: 2026-02-19 19:37:26.852 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.115s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.141 186666 DEBUG nova.compute.manager [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.143 186666 DEBUG nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.144 186666 INFO nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Creating image(s)
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.145 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Acquiring lock "/var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.145 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "/var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.147 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "/var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.148 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.154 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.156 186666 DEBUG oslo_concurrency.processutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.233 186666 DEBUG oslo_concurrency.processutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.234 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.235 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.236 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.242 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.243 186666 DEBUG oslo_concurrency.processutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.287 186666 DEBUG oslo_concurrency.processutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.289 186666 DEBUG oslo_concurrency.processutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.316 186666 DEBUG oslo_concurrency.processutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.317 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.081s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.317 186666 DEBUG oslo_concurrency.processutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.369 186666 DEBUG oslo_concurrency.processutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.370 186666 DEBUG nova.virt.disk.api [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Checking if we can resize image /var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.371 186666 DEBUG oslo_concurrency.processutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.420 186666 DEBUG oslo_concurrency.processutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.421 186666 DEBUG nova.virt.disk.api [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Cannot resize image /var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.422 186666 DEBUG nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.422 186666 DEBUG nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Ensure instance console log exists: /var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.422 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.423 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.423 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.780 186666 DEBUG nova.network.neutron [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Successfully updated port: ed766c0f-2a6a-47e2-b4a2-456e041040d2 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.860 186666 DEBUG nova.compute.manager [req-4f3ca1ff-c58d-4a88-9dba-ad66efae9e08 req-47103683-7aeb-4cf8-b335-fb774d952435 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Received event network-changed-ed766c0f-2a6a-47e2-b4a2-456e041040d2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.861 186666 DEBUG nova.compute.manager [req-4f3ca1ff-c58d-4a88-9dba-ad66efae9e08 req-47103683-7aeb-4cf8-b335-fb774d952435 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Refreshing instance network info cache due to event network-changed-ed766c0f-2a6a-47e2-b4a2-456e041040d2. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.861 186666 DEBUG oslo_concurrency.lockutils [req-4f3ca1ff-c58d-4a88-9dba-ad66efae9e08 req-47103683-7aeb-4cf8-b335-fb774d952435 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-31cc0f13-bb90-49cf-b9a0-e018920aabeb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.861 186666 DEBUG oslo_concurrency.lockutils [req-4f3ca1ff-c58d-4a88-9dba-ad66efae9e08 req-47103683-7aeb-4cf8-b335-fb774d952435 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-31cc0f13-bb90-49cf-b9a0-e018920aabeb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:37:27 compute-0 nova_compute[186662]: 2026-02-19 19:37:27.861 186666 DEBUG nova.network.neutron [req-4f3ca1ff-c58d-4a88-9dba-ad66efae9e08 req-47103683-7aeb-4cf8-b335-fb774d952435 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Refreshing network info cache for port ed766c0f-2a6a-47e2-b4a2-456e041040d2 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Feb 19 19:37:28 compute-0 nova_compute[186662]: 2026-02-19 19:37:28.285 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Acquiring lock "refresh_cache-31cc0f13-bb90-49cf-b9a0-e018920aabeb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:37:28 compute-0 nova_compute[186662]: 2026-02-19 19:37:28.367 186666 WARNING neutronclient.v2_0.client [req-4f3ca1ff-c58d-4a88-9dba-ad66efae9e08 req-47103683-7aeb-4cf8-b335-fb774d952435 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:37:28 compute-0 nova_compute[186662]: 2026-02-19 19:37:28.839 186666 DEBUG nova.network.neutron [req-4f3ca1ff-c58d-4a88-9dba-ad66efae9e08 req-47103683-7aeb-4cf8-b335-fb774d952435 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:37:29 compute-0 nova_compute[186662]: 2026-02-19 19:37:29.001 186666 DEBUG nova.network.neutron [req-4f3ca1ff-c58d-4a88-9dba-ad66efae9e08 req-47103683-7aeb-4cf8-b335-fb774d952435 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:37:29 compute-0 nova_compute[186662]: 2026-02-19 19:37:29.085 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:29 compute-0 nova_compute[186662]: 2026-02-19 19:37:29.344 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:29 compute-0 nova_compute[186662]: 2026-02-19 19:37:29.507 186666 DEBUG oslo_concurrency.lockutils [req-4f3ca1ff-c58d-4a88-9dba-ad66efae9e08 req-47103683-7aeb-4cf8-b335-fb774d952435 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-31cc0f13-bb90-49cf-b9a0-e018920aabeb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:37:29 compute-0 nova_compute[186662]: 2026-02-19 19:37:29.508 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Acquired lock "refresh_cache-31cc0f13-bb90-49cf-b9a0-e018920aabeb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:37:29 compute-0 nova_compute[186662]: 2026-02-19 19:37:29.508 186666 DEBUG nova.network.neutron [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:37:29 compute-0 podman[196025]: time="2026-02-19T19:37:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:37:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:37:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:37:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:37:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2195 "" "Go-http-client/1.1"
Feb 19 19:37:30 compute-0 nova_compute[186662]: 2026-02-19 19:37:30.409 186666 DEBUG nova.network.neutron [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:37:30 compute-0 nova_compute[186662]: 2026-02-19 19:37:30.711 186666 WARNING neutronclient.v2_0.client [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:37:30 compute-0 nova_compute[186662]: 2026-02-19 19:37:30.952 186666 DEBUG nova.network.neutron [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Updating instance_info_cache with network_info: [{"id": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "address": "fa:16:3e:04:6b:89", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped766c0f-2a", "ovs_interfaceid": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:37:31 compute-0 openstack_network_exporter[198916]: ERROR   19:37:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:37:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:37:31 compute-0 openstack_network_exporter[198916]: ERROR   19:37:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:37:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.459 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Releasing lock "refresh_cache-31cc0f13-bb90-49cf-b9a0-e018920aabeb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.459 186666 DEBUG nova.compute.manager [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Instance network_info: |[{"id": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "address": "fa:16:3e:04:6b:89", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped766c0f-2a", "ovs_interfaceid": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.462 186666 DEBUG nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Start _get_guest_xml network_info=[{"id": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "address": "fa:16:3e:04:6b:89", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped766c0f-2a", "ovs_interfaceid": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'image_id': 'b8007ea6-afa7-4c5a-abc0-d9d7338ce087'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.465 186666 WARNING nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.467 186666 DEBUG nova.virt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-185556420', uuid='31cc0f13-bb90-49cf-b9a0-e018920aabeb'), owner=OwnerMeta(userid='08866af24c3a4551a81c8099dc9049fb', username='tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347-project-admin', projectid='673c40a9f52d4914b8ae3fc458b05edf', projectname='tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347'), image=ImageMeta(id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "address": "fa:16:3e:04:6b:89", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": 
"taped766c0f-2a", "ovs_interfaceid": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1771529851.4673345) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.471 186666 DEBUG nova.virt.libvirt.host [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.471 186666 DEBUG nova.virt.libvirt.host [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.474 186666 DEBUG nova.virt.libvirt.host [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.474 186666 DEBUG nova.virt.libvirt.host [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.475 186666 DEBUG nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.476 186666 DEBUG nova.virt.hardware [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-19T19:19:33Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.476 186666 DEBUG nova.virt.hardware [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.476 186666 DEBUG nova.virt.hardware [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.477 186666 DEBUG nova.virt.hardware [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.477 186666 DEBUG nova.virt.hardware [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.477 186666 DEBUG nova.virt.hardware [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.477 186666 DEBUG nova.virt.hardware [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.478 186666 DEBUG nova.virt.hardware [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.478 186666 DEBUG nova.virt.hardware [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.478 186666 DEBUG nova.virt.hardware [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.478 186666 DEBUG nova.virt.hardware [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.481 186666 DEBUG nova.virt.libvirt.vif [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:37:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-185556420',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-185',id=17,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='673c40a9f52d4914b8ae3fc458b05edf',ramdisk_id='',reservation_id='r-2ie1ziua',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347',owne
r_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:37:26Z,user_data=None,user_id='08866af24c3a4551a81c8099dc9049fb',uuid=31cc0f13-bb90-49cf-b9a0-e018920aabeb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "address": "fa:16:3e:04:6b:89", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped766c0f-2a", "ovs_interfaceid": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.482 186666 DEBUG nova.network.os_vif_util [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Converting VIF {"id": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "address": "fa:16:3e:04:6b:89", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped766c0f-2a", "ovs_interfaceid": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.482 186666 DEBUG nova.network.os_vif_util [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:6b:89,bridge_name='br-int',has_traffic_filtering=True,id=ed766c0f-2a6a-47e2-b4a2-456e041040d2,network=Network(a8815e8f-3945-4eb5-98d8-e31bbb7c874a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped766c0f-2a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.483 186666 DEBUG nova.objects.instance [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lazy-loading 'pci_devices' on Instance uuid 31cc0f13-bb90-49cf-b9a0-e018920aabeb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.988 186666 DEBUG nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] End _get_guest_xml xml=<domain type="kvm">
Feb 19 19:37:31 compute-0 nova_compute[186662]:   <uuid>31cc0f13-bb90-49cf-b9a0-e018920aabeb</uuid>
Feb 19 19:37:31 compute-0 nova_compute[186662]:   <name>instance-00000011</name>
Feb 19 19:37:31 compute-0 nova_compute[186662]:   <memory>131072</memory>
Feb 19 19:37:31 compute-0 nova_compute[186662]:   <vcpu>1</vcpu>
Feb 19 19:37:31 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-185556420</nova:name>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:37:31</nova:creationTime>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:37:31 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:37:31 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:37:31 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:37:31 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:37:31 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:37:31 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:37:31 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:37:31 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:37:31 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:37:31 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:37:31 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:37:31 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:37:31 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:37:31 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:37:31 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:37:31 compute-0 nova_compute[186662]:         <nova:user uuid="08866af24c3a4551a81c8099dc9049fb">tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347-project-admin</nova:user>
Feb 19 19:37:31 compute-0 nova_compute[186662]:         <nova:project uuid="673c40a9f52d4914b8ae3fc458b05edf">tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347</nova:project>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:37:31 compute-0 nova_compute[186662]:         <nova:port uuid="ed766c0f-2a6a-47e2-b4a2-456e041040d2">
Feb 19 19:37:31 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:37:31 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:37:31 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <system>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <entry name="serial">31cc0f13-bb90-49cf-b9a0-e018920aabeb</entry>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <entry name="uuid">31cc0f13-bb90-49cf-b9a0-e018920aabeb</entry>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     </system>
Feb 19 19:37:31 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:37:31 compute-0 nova_compute[186662]:   <os>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:   </os>
Feb 19 19:37:31 compute-0 nova_compute[186662]:   <features>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <vmcoreinfo/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:   </features>
Feb 19 19:37:31 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:37:31 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact">
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <model>Nehalem</model>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <topology sockets="1" cores="1" threads="1"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:37:31 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk.config"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <interface type="ethernet">
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <mac address="fa:16:3e:04:6b:89"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <mtu size="1442"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <target dev="taped766c0f-2a"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <serial type="pty">
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/console.log" append="off"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <video>
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     </video>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <controller type="usb" index="0"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:37:31 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:37:31 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:37:31 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:37:31 compute-0 nova_compute[186662]: </domain>
Feb 19 19:37:31 compute-0 nova_compute[186662]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.989 186666 DEBUG nova.compute.manager [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Preparing to wait for external event network-vif-plugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.989 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Acquiring lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.989 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.989 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.990 186666 DEBUG nova.virt.libvirt.vif [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:37:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-185556420',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-185',id=17,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='673c40a9f52d4914b8ae3fc458b05edf',ramdisk_id='',reservation_id='r-2ie1ziua',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-207598
3347',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:37:26Z,user_data=None,user_id='08866af24c3a4551a81c8099dc9049fb',uuid=31cc0f13-bb90-49cf-b9a0-e018920aabeb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "address": "fa:16:3e:04:6b:89", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped766c0f-2a", "ovs_interfaceid": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.991 186666 DEBUG nova.network.os_vif_util [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Converting VIF {"id": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "address": "fa:16:3e:04:6b:89", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped766c0f-2a", "ovs_interfaceid": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.991 186666 DEBUG nova.network.os_vif_util [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:6b:89,bridge_name='br-int',has_traffic_filtering=True,id=ed766c0f-2a6a-47e2-b4a2-456e041040d2,network=Network(a8815e8f-3945-4eb5-98d8-e31bbb7c874a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped766c0f-2a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.992 186666 DEBUG os_vif [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:6b:89,bridge_name='br-int',has_traffic_filtering=True,id=ed766c0f-2a6a-47e2-b4a2-456e041040d2,network=Network(a8815e8f-3945-4eb5-98d8-e31bbb7c874a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped766c0f-2a') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.992 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.993 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.993 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.994 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.994 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'ebf1fe74-5588-5618-8547-e7a206f03f59', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.995 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.996 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.997 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.998 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped766c0f-2a, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.998 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=taped766c0f-2a, col_values=(('qos', UUID('39102e8b-e7d8-4480-a8cf-90f42674b4c2')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.998 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=taped766c0f-2a, col_values=(('external_ids', {'iface-id': 'ed766c0f-2a6a-47e2-b4a2-456e041040d2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:04:6b:89', 'vm-uuid': '31cc0f13-bb90-49cf-b9a0-e018920aabeb'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:37:31 compute-0 nova_compute[186662]: 2026-02-19 19:37:31.999 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:32 compute-0 NetworkManager[56519]: <info>  [1771529852.0000] manager: (taped766c0f-2a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Feb 19 19:37:32 compute-0 nova_compute[186662]: 2026-02-19 19:37:32.001 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:37:32 compute-0 nova_compute[186662]: 2026-02-19 19:37:32.003 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:32 compute-0 nova_compute[186662]: 2026-02-19 19:37:32.004 186666 INFO os_vif [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:6b:89,bridge_name='br-int',has_traffic_filtering=True,id=ed766c0f-2a6a-47e2-b4a2-456e041040d2,network=Network(a8815e8f-3945-4eb5-98d8-e31bbb7c874a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped766c0f-2a')
Feb 19 19:37:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:32.138 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:37:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:32.138 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:37:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:32.139 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:37:33 compute-0 nova_compute[186662]: 2026-02-19 19:37:33.536 186666 DEBUG nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:37:33 compute-0 nova_compute[186662]: 2026-02-19 19:37:33.537 186666 DEBUG nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:37:33 compute-0 nova_compute[186662]: 2026-02-19 19:37:33.537 186666 DEBUG nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] No VIF found with MAC fa:16:3e:04:6b:89, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Feb 19 19:37:33 compute-0 nova_compute[186662]: 2026-02-19 19:37:33.537 186666 INFO nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Using config drive
Feb 19 19:37:34 compute-0 nova_compute[186662]: 2026-02-19 19:37:34.046 186666 WARNING neutronclient.v2_0.client [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:37:34 compute-0 nova_compute[186662]: 2026-02-19 19:37:34.087 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:34 compute-0 nova_compute[186662]: 2026-02-19 19:37:34.818 186666 INFO nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Creating config drive at /var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk.config
Feb 19 19:37:34 compute-0 nova_compute[186662]: 2026-02-19 19:37:34.821 186666 DEBUG oslo_concurrency.processutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp1sw4wmxl execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:37:34 compute-0 nova_compute[186662]: 2026-02-19 19:37:34.936 186666 DEBUG oslo_concurrency.processutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp1sw4wmxl" returned: 0 in 0.115s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:37:34 compute-0 kernel: taped766c0f-2a: entered promiscuous mode
Feb 19 19:37:34 compute-0 NetworkManager[56519]: <info>  [1771529854.9916] manager: (taped766c0f-2a): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.042 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:35 compute-0 systemd-udevd[213283]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:37:35 compute-0 ovn_controller[96653]: 2026-02-19T19:37:35Z|00141|binding|INFO|Claiming lport ed766c0f-2a6a-47e2-b4a2-456e041040d2 for this chassis.
Feb 19 19:37:35 compute-0 ovn_controller[96653]: 2026-02-19T19:37:35Z|00142|binding|INFO|ed766c0f-2a6a-47e2-b4a2-456e041040d2: Claiming fa:16:3e:04:6b:89 10.100.0.12
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.049 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:35 compute-0 NetworkManager[56519]: <info>  [1771529855.0539] device (taped766c0f-2a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:37:35 compute-0 NetworkManager[56519]: <info>  [1771529855.0545] device (taped766c0f-2a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.057 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:6b:89 10.100.0.12'], port_security=['fa:16:3e:04:6b:89 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '31cc0f13-bb90-49cf-b9a0-e018920aabeb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a8815e8f-3945-4eb5-98d8-e31bbb7c874a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '673c40a9f52d4914b8ae3fc458b05edf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '21b17461-96ed-47ff-b7be-d33698019865', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ccf569e3-981f-424a-aec6-af69ea494142, chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=ed766c0f-2a6a-47e2-b4a2-456e041040d2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.058 105986 INFO neutron.agent.ovn.metadata.agent [-] Port ed766c0f-2a6a-47e2-b4a2-456e041040d2 in datapath a8815e8f-3945-4eb5-98d8-e31bbb7c874a bound to our chassis
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.059 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a8815e8f-3945-4eb5-98d8-e31bbb7c874a
Feb 19 19:37:35 compute-0 ovn_controller[96653]: 2026-02-19T19:37:35Z|00143|binding|INFO|Setting lport ed766c0f-2a6a-47e2-b4a2-456e041040d2 ovn-installed in OVS
Feb 19 19:37:35 compute-0 ovn_controller[96653]: 2026-02-19T19:37:35Z|00144|binding|INFO|Setting lport ed766c0f-2a6a-47e2-b4a2-456e041040d2 up in Southbound
Feb 19 19:37:35 compute-0 systemd-machined[156014]: New machine qemu-13-instance-00000011.
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.068 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.069 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[6c28923c-f49d-43ed-aa46-e669fa505e67]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.069 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa8815e8f-31 in ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.071 207540 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa8815e8f-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.071 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[9e21adfe-dbab-401d-924e-2cfbec559c0b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.072 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[778b816c-99c4-440b-b0f9-804902d0649b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:37:35 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-00000011.
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.079 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[58c0ae6c-8f3a-4b7c-8ccb-51f4fe1b949c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.083 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[161259a7-c355-43f9-ae4c-c1b341dc849d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.104 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[36f0c74d-05eb-490b-bd9d-525e72fc9be7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.110 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[75edc17b-703a-477e-90d8-9749d42246c7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:37:35 compute-0 systemd-udevd[213287]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:37:35 compute-0 NetworkManager[56519]: <info>  [1771529855.1110] manager: (tapa8815e8f-30): new Veth device (/org/freedesktop/NetworkManager/Devices/57)
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.134 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[b21accc8-0d64-4364-aacc-da26124c5bb1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.136 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[836fa4ff-0e8a-4b9d-ace9-d7fbb8b78cfc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:37:35 compute-0 NetworkManager[56519]: <info>  [1771529855.1525] device (tapa8815e8f-30): carrier: link connected
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.156 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[84f6c841-97b7-4a10-822d-2e317262744f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.167 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[f4f47972-854f-43ea-998e-a2158bbb306c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa8815e8f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:32:22:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412471, 'reachable_time': 33259, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213319, 'error': None, 'target': 'ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.179 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[16ddd6a3-ee93-4a2b-8da7-209d0cfb3414]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe32:22a3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 412471, 'tstamp': 412471}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213320, 'error': None, 'target': 'ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.190 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[c8532cd2-54c3-4e59-b69d-b8a8550379c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa8815e8f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:32:22:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412471, 'reachable_time': 33259, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213321, 'error': None, 'target': 'ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.209 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[001eabcd-cdcd-42aa-b57c-f2985a1d9430]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.241 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[7dc76baa-def7-43b5-8b4a-dbd6fb78ca98]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.241 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa8815e8f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.242 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.242 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa8815e8f-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.243 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:35 compute-0 NetworkManager[56519]: <info>  [1771529855.2440] manager: (tapa8815e8f-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Feb 19 19:37:35 compute-0 kernel: tapa8815e8f-30: entered promiscuous mode
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.247 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa8815e8f-30, col_values=(('external_ids', {'iface-id': '96538c20-f183-445c-bcae-80553c2c1fd2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.247 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:35 compute-0 ovn_controller[96653]: 2026-02-19T19:37:35Z|00145|binding|INFO|Releasing lport 96538c20-f183-445c-bcae-80553c2c1fd2 from this chassis (sb_readonly=0)
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.249 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[436edc60-045e-4ec0-8ea8-2d1b15b75edc]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.250 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.250 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.250 105986 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for a8815e8f-3945-4eb5-98d8-e31bbb7c874a disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.250 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.251 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[6e75b7f2-e3dd-4c2e-b9ea-96bc0c50d8ec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.251 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.252 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.253 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[75b9a923-ac0a-428a-a067-e28275e568d1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.253 105986 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: global
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     log         /dev/log local0 debug
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     log-tag     haproxy-metadata-proxy-a8815e8f-3945-4eb5-98d8-e31bbb7c874a
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     user        root
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     group       root
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     maxconn     1024
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     pidfile     /var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     daemon
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: defaults
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     log global
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     mode http
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     option httplog
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     option dontlognull
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     option http-server-close
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     option forwardfor
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     retries                 3
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     timeout http-request    30s
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     timeout connect         30s
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     timeout client          32s
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     timeout server          32s
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     timeout http-keep-alive 30s
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: listen listener
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     bind 169.254.169.254:80
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     server metadata /var/lib/neutron/metadata_proxy
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:     http-request add-header X-OVN-Network-ID a8815e8f-3945-4eb5-98d8-e31bbb7c874a
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Feb 19 19:37:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:37:35.253 105986 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a', 'env', 'PROCESS_TAG=haproxy-a8815e8f-3945-4eb5-98d8-e31bbb7c874a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.297 186666 DEBUG nova.compute.manager [req-845574f8-ac79-40b7-b58a-f080180dcdd3 req-0858f4a6-98d9-4815-a69e-7c744e0f08c4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Received event network-vif-plugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.297 186666 DEBUG oslo_concurrency.lockutils [req-845574f8-ac79-40b7-b58a-f080180dcdd3 req-0858f4a6-98d9-4815-a69e-7c744e0f08c4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.297 186666 DEBUG oslo_concurrency.lockutils [req-845574f8-ac79-40b7-b58a-f080180dcdd3 req-0858f4a6-98d9-4815-a69e-7c744e0f08c4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.298 186666 DEBUG oslo_concurrency.lockutils [req-845574f8-ac79-40b7-b58a-f080180dcdd3 req-0858f4a6-98d9-4815-a69e-7c744e0f08c4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.298 186666 DEBUG nova.compute.manager [req-845574f8-ac79-40b7-b58a-f080180dcdd3 req-0858f4a6-98d9-4815-a69e-7c744e0f08c4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Processing event network-vif-plugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Feb 19 19:37:35 compute-0 sshd-session[213266]: Invalid user n8n from 197.211.55.20 port 42802
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.379 186666 DEBUG nova.compute.manager [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.381 186666 DEBUG nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.384 186666 INFO nova.virt.libvirt.driver [-] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Instance spawned successfully.
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.385 186666 DEBUG nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Feb 19 19:37:35 compute-0 sshd-session[213266]: Received disconnect from 197.211.55.20 port 42802:11: Bye Bye [preauth]
Feb 19 19:37:35 compute-0 sshd-session[213266]: Disconnected from invalid user n8n 197.211.55.20 port 42802 [preauth]
Feb 19 19:37:35 compute-0 podman[213360]: 2026-02-19 19:37:35.606397565 +0000 UTC m=+0.054064065 container create 604bbb662c7722b0ca23a6f27a4ff2eab79c8ea873258811f37020eb1a190180 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Feb 19 19:37:35 compute-0 systemd[1]: Started libpod-conmon-604bbb662c7722b0ca23a6f27a4ff2eab79c8ea873258811f37020eb1a190180.scope.
Feb 19 19:37:35 compute-0 podman[213360]: 2026-02-19 19:37:35.569980939 +0000 UTC m=+0.017647449 image pull dfc4ee2c621ea595c16a7cd7a8233293e80df6b99346b1ce1a7ed22a35aace27 38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Feb 19 19:37:35 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:37:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c62c4e64ddd1835091653438a595c64b7f5a54ab93b8733b98a05af02e1e97cb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 19 19:37:35 compute-0 podman[213360]: 2026-02-19 19:37:35.701055607 +0000 UTC m=+0.148722097 container init 604bbb662c7722b0ca23a6f27a4ff2eab79c8ea873258811f37020eb1a190180 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.43.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Feb 19 19:37:35 compute-0 podman[213360]: 2026-02-19 19:37:35.708327573 +0000 UTC m=+0.155994063 container start 604bbb662c7722b0ca23a6f27a4ff2eab79c8ea873258811f37020eb1a190180 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:37:35 compute-0 neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a[213375]: [NOTICE]   (213379) : New worker (213381) forked
Feb 19 19:37:35 compute-0 neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a[213375]: [NOTICE]   (213379) : Loading success.
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.894 186666 DEBUG nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.895 186666 DEBUG nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.895 186666 DEBUG nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.895 186666 DEBUG nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.896 186666 DEBUG nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:37:35 compute-0 nova_compute[186662]: 2026-02-19 19:37:35.896 186666 DEBUG nova.virt.libvirt.driver [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:37:36 compute-0 nova_compute[186662]: 2026-02-19 19:37:36.405 186666 INFO nova.compute.manager [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Took 9.26 seconds to spawn the instance on the hypervisor.
Feb 19 19:37:36 compute-0 nova_compute[186662]: 2026-02-19 19:37:36.405 186666 DEBUG nova.compute.manager [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:37:36 compute-0 nova_compute[186662]: 2026-02-19 19:37:36.959 186666 INFO nova.compute.manager [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Took 14.51 seconds to build instance.
Feb 19 19:37:37 compute-0 nova_compute[186662]: 2026-02-19 19:37:36.999 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:37 compute-0 nova_compute[186662]: 2026-02-19 19:37:37.392 186666 DEBUG nova.compute.manager [req-57559706-2f75-4949-8439-525704908597 req-d2990c90-40e5-4fd9-b67c-17c59b11c832 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Received event network-vif-plugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:37:37 compute-0 nova_compute[186662]: 2026-02-19 19:37:37.393 186666 DEBUG oslo_concurrency.lockutils [req-57559706-2f75-4949-8439-525704908597 req-d2990c90-40e5-4fd9-b67c-17c59b11c832 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:37:37 compute-0 nova_compute[186662]: 2026-02-19 19:37:37.394 186666 DEBUG oslo_concurrency.lockutils [req-57559706-2f75-4949-8439-525704908597 req-d2990c90-40e5-4fd9-b67c-17c59b11c832 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:37:37 compute-0 nova_compute[186662]: 2026-02-19 19:37:37.394 186666 DEBUG oslo_concurrency.lockutils [req-57559706-2f75-4949-8439-525704908597 req-d2990c90-40e5-4fd9-b67c-17c59b11c832 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:37:37 compute-0 nova_compute[186662]: 2026-02-19 19:37:37.394 186666 DEBUG nova.compute.manager [req-57559706-2f75-4949-8439-525704908597 req-d2990c90-40e5-4fd9-b67c-17c59b11c832 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] No waiting events found dispatching network-vif-plugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:37:37 compute-0 nova_compute[186662]: 2026-02-19 19:37:37.395 186666 WARNING nova.compute.manager [req-57559706-2f75-4949-8439-525704908597 req-d2990c90-40e5-4fd9-b67c-17c59b11c832 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Received unexpected event network-vif-plugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 for instance with vm_state active and task_state None.
Feb 19 19:37:37 compute-0 nova_compute[186662]: 2026-02-19 19:37:37.470 186666 DEBUG oslo_concurrency.lockutils [None req-aa183f07-0463-4624-8b50-f8c6ea357e52 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.050s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:37:39 compute-0 nova_compute[186662]: 2026-02-19 19:37:39.089 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:42 compute-0 nova_compute[186662]: 2026-02-19 19:37:42.001 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:43 compute-0 nova_compute[186662]: 2026-02-19 19:37:43.818 186666 DEBUG nova.compute.manager [None req-5f8dd811-1d36-4a58-9020-10e624857b60 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:636
Feb 19 19:37:43 compute-0 nova_compute[186662]: 2026-02-19 19:37:43.879 186666 DEBUG nova.compute.provider_tree [None req-5f8dd811-1d36-4a58-9020-10e624857b60 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Updating resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 generation from 17 to 21 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Feb 19 19:37:44 compute-0 nova_compute[186662]: 2026-02-19 19:37:44.091 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:44 compute-0 podman[213390]: 2026-02-19 19:37:44.27784211 +0000 UTC m=+0.046773988 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Feb 19 19:37:47 compute-0 nova_compute[186662]: 2026-02-19 19:37:47.003 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:48 compute-0 ovn_controller[96653]: 2026-02-19T19:37:48Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:04:6b:89 10.100.0.12
Feb 19 19:37:48 compute-0 ovn_controller[96653]: 2026-02-19T19:37:48Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:04:6b:89 10.100.0.12
Feb 19 19:37:49 compute-0 nova_compute[186662]: 2026-02-19 19:37:49.130 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:49 compute-0 podman[213427]: 2026-02-19 19:37:49.260862541 +0000 UTC m=+0.042349080 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, config_id=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 19 19:37:50 compute-0 nova_compute[186662]: 2026-02-19 19:37:50.322 186666 DEBUG nova.virt.libvirt.driver [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Check if temp file /var/lib/nova/instances/tmpjc1vopkt exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Feb 19 19:37:50 compute-0 nova_compute[186662]: 2026-02-19 19:37:50.326 186666 DEBUG nova.compute.manager [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjc1vopkt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='31cc0f13-bb90-49cf-b9a0-e018920aabeb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Feb 19 19:37:51 compute-0 podman[213448]: 2026-02-19 19:37:51.276334213 +0000 UTC m=+0.059236601 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=watcher_latest, tcib_managed=true)
Feb 19 19:37:51 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 19 19:37:52 compute-0 nova_compute[186662]: 2026-02-19 19:37:52.006 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:54 compute-0 nova_compute[186662]: 2026-02-19 19:37:54.133 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:54 compute-0 nova_compute[186662]: 2026-02-19 19:37:54.923 186666 DEBUG oslo_concurrency.processutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:37:54 compute-0 nova_compute[186662]: 2026-02-19 19:37:54.986 186666 DEBUG oslo_concurrency.processutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:37:54 compute-0 nova_compute[186662]: 2026-02-19 19:37:54.987 186666 DEBUG oslo_concurrency.processutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:37:55 compute-0 nova_compute[186662]: 2026-02-19 19:37:55.029 186666 DEBUG oslo_concurrency.processutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:37:55 compute-0 nova_compute[186662]: 2026-02-19 19:37:55.030 186666 DEBUG nova.compute.manager [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Preparing to wait for external event network-vif-plugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Feb 19 19:37:55 compute-0 nova_compute[186662]: 2026-02-19 19:37:55.030 186666 DEBUG oslo_concurrency.lockutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:37:55 compute-0 nova_compute[186662]: 2026-02-19 19:37:55.031 186666 DEBUG oslo_concurrency.lockutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:37:55 compute-0 nova_compute[186662]: 2026-02-19 19:37:55.031 186666 DEBUG oslo_concurrency.lockutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:37:55 compute-0 podman[213483]: 2026-02-19 19:37:55.269401935 +0000 UTC m=+0.043877198 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:37:57 compute-0 nova_compute[186662]: 2026-02-19 19:37:57.008 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:59 compute-0 nova_compute[186662]: 2026-02-19 19:37:59.134 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:37:59 compute-0 podman[196025]: time="2026-02-19T19:37:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:37:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:37:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17221 "" "Go-http-client/1.1"
Feb 19 19:37:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:37:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2661 "" "Go-http-client/1.1"
Feb 19 19:38:01 compute-0 nova_compute[186662]: 2026-02-19 19:38:01.079 186666 DEBUG nova.compute.manager [req-e6116801-f4ce-440d-bb64-615e785f7940 req-75c42369-4549-415d-b8d8-c329be846924 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Received event network-vif-unplugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:38:01 compute-0 nova_compute[186662]: 2026-02-19 19:38:01.079 186666 DEBUG oslo_concurrency.lockutils [req-e6116801-f4ce-440d-bb64-615e785f7940 req-75c42369-4549-415d-b8d8-c329be846924 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:38:01 compute-0 nova_compute[186662]: 2026-02-19 19:38:01.079 186666 DEBUG oslo_concurrency.lockutils [req-e6116801-f4ce-440d-bb64-615e785f7940 req-75c42369-4549-415d-b8d8-c329be846924 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:38:01 compute-0 nova_compute[186662]: 2026-02-19 19:38:01.080 186666 DEBUG oslo_concurrency.lockutils [req-e6116801-f4ce-440d-bb64-615e785f7940 req-75c42369-4549-415d-b8d8-c329be846924 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:38:01 compute-0 nova_compute[186662]: 2026-02-19 19:38:01.080 186666 DEBUG nova.compute.manager [req-e6116801-f4ce-440d-bb64-615e785f7940 req-75c42369-4549-415d-b8d8-c329be846924 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] No event matching network-vif-unplugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 in dict_keys([('network-vif-plugged', 'ed766c0f-2a6a-47e2-b4a2-456e041040d2')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Feb 19 19:38:01 compute-0 nova_compute[186662]: 2026-02-19 19:38:01.080 186666 DEBUG nova.compute.manager [req-e6116801-f4ce-440d-bb64-615e785f7940 req-75c42369-4549-415d-b8d8-c329be846924 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Received event network-vif-unplugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:38:01 compute-0 openstack_network_exporter[198916]: ERROR   19:38:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:38:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:38:01 compute-0 openstack_network_exporter[198916]: ERROR   19:38:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:38:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:38:01 compute-0 anacron[49500]: Job `cron.daily' started
Feb 19 19:38:01 compute-0 anacron[49500]: Job `cron.daily' terminated
Feb 19 19:38:02 compute-0 nova_compute[186662]: 2026-02-19 19:38:02.010 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:02 compute-0 nova_compute[186662]: 2026-02-19 19:38:02.599 186666 INFO nova.compute.manager [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Took 7.57 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 19 19:38:03 compute-0 nova_compute[186662]: 2026-02-19 19:38:03.173 186666 DEBUG nova.compute.manager [req-fe31b7fd-f98a-41b8-8c74-3fda4f4e6f5b req-630fade9-4359-444a-b5a6-b9214f993790 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Received event network-vif-plugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:38:03 compute-0 nova_compute[186662]: 2026-02-19 19:38:03.173 186666 DEBUG oslo_concurrency.lockutils [req-fe31b7fd-f98a-41b8-8c74-3fda4f4e6f5b req-630fade9-4359-444a-b5a6-b9214f993790 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:38:03 compute-0 nova_compute[186662]: 2026-02-19 19:38:03.173 186666 DEBUG oslo_concurrency.lockutils [req-fe31b7fd-f98a-41b8-8c74-3fda4f4e6f5b req-630fade9-4359-444a-b5a6-b9214f993790 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:38:03 compute-0 nova_compute[186662]: 2026-02-19 19:38:03.173 186666 DEBUG oslo_concurrency.lockutils [req-fe31b7fd-f98a-41b8-8c74-3fda4f4e6f5b req-630fade9-4359-444a-b5a6-b9214f993790 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:38:03 compute-0 nova_compute[186662]: 2026-02-19 19:38:03.174 186666 DEBUG nova.compute.manager [req-fe31b7fd-f98a-41b8-8c74-3fda4f4e6f5b req-630fade9-4359-444a-b5a6-b9214f993790 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Processing event network-vif-plugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Feb 19 19:38:03 compute-0 nova_compute[186662]: 2026-02-19 19:38:03.174 186666 DEBUG nova.compute.manager [req-fe31b7fd-f98a-41b8-8c74-3fda4f4e6f5b req-630fade9-4359-444a-b5a6-b9214f993790 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Received event network-changed-ed766c0f-2a6a-47e2-b4a2-456e041040d2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:38:03 compute-0 nova_compute[186662]: 2026-02-19 19:38:03.174 186666 DEBUG nova.compute.manager [req-fe31b7fd-f98a-41b8-8c74-3fda4f4e6f5b req-630fade9-4359-444a-b5a6-b9214f993790 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Refreshing instance network info cache due to event network-changed-ed766c0f-2a6a-47e2-b4a2-456e041040d2. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Feb 19 19:38:03 compute-0 nova_compute[186662]: 2026-02-19 19:38:03.174 186666 DEBUG oslo_concurrency.lockutils [req-fe31b7fd-f98a-41b8-8c74-3fda4f4e6f5b req-630fade9-4359-444a-b5a6-b9214f993790 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-31cc0f13-bb90-49cf-b9a0-e018920aabeb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:38:03 compute-0 nova_compute[186662]: 2026-02-19 19:38:03.174 186666 DEBUG oslo_concurrency.lockutils [req-fe31b7fd-f98a-41b8-8c74-3fda4f4e6f5b req-630fade9-4359-444a-b5a6-b9214f993790 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-31cc0f13-bb90-49cf-b9a0-e018920aabeb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:38:03 compute-0 nova_compute[186662]: 2026-02-19 19:38:03.175 186666 DEBUG nova.network.neutron [req-fe31b7fd-f98a-41b8-8c74-3fda4f4e6f5b req-630fade9-4359-444a-b5a6-b9214f993790 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Refreshing network info cache for port ed766c0f-2a6a-47e2-b4a2-456e041040d2 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Feb 19 19:38:03 compute-0 nova_compute[186662]: 2026-02-19 19:38:03.176 186666 DEBUG nova.compute.manager [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Feb 19 19:38:03 compute-0 nova_compute[186662]: 2026-02-19 19:38:03.682 186666 WARNING neutronclient.v2_0.client [req-fe31b7fd-f98a-41b8-8c74-3fda4f4e6f5b req-630fade9-4359-444a-b5a6-b9214f993790 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:38:03 compute-0 nova_compute[186662]: 2026-02-19 19:38:03.686 186666 DEBUG nova.compute.manager [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjc1vopkt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='31cc0f13-bb90-49cf-b9a0-e018920aabeb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(50b28c67-6a2e-41ca-9408-9a0b465f5e60),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Feb 19 19:38:04 compute-0 nova_compute[186662]: 2026-02-19 19:38:04.136 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:04 compute-0 nova_compute[186662]: 2026-02-19 19:38:04.200 186666 DEBUG nova.objects.instance [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'migration_context' on Instance uuid 31cc0f13-bb90-49cf-b9a0-e018920aabeb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:38:04 compute-0 nova_compute[186662]: 2026-02-19 19:38:04.201 186666 DEBUG nova.virt.libvirt.driver [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Feb 19 19:38:04 compute-0 nova_compute[186662]: 2026-02-19 19:38:04.202 186666 DEBUG nova.virt.libvirt.driver [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Feb 19 19:38:04 compute-0 nova_compute[186662]: 2026-02-19 19:38:04.202 186666 DEBUG nova.virt.libvirt.driver [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Feb 19 19:38:04 compute-0 nova_compute[186662]: 2026-02-19 19:38:04.704 186666 DEBUG nova.virt.libvirt.driver [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Feb 19 19:38:04 compute-0 nova_compute[186662]: 2026-02-19 19:38:04.704 186666 DEBUG nova.virt.libvirt.driver [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Feb 19 19:38:04 compute-0 nova_compute[186662]: 2026-02-19 19:38:04.720 186666 DEBUG nova.virt.libvirt.vif [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:37:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-185556420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-185',id=17,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:37:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='673c40a9f52d4914b8ae3fc458b05edf',ramdisk_id='',reservation_id='r-2ie1ziua',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',
image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:37:36Z,user_data=None,user_id='08866af24c3a4551a81c8099dc9049fb',uuid=31cc0f13-bb90-49cf-b9a0-e018920aabeb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "address": "fa:16:3e:04:6b:89", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "taped766c0f-2a", "ovs_interfaceid": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Feb 19 19:38:04 compute-0 nova_compute[186662]: 2026-02-19 19:38:04.720 186666 DEBUG nova.network.os_vif_util [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "address": "fa:16:3e:04:6b:89", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "taped766c0f-2a", "ovs_interfaceid": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:38:04 compute-0 nova_compute[186662]: 2026-02-19 19:38:04.721 186666 DEBUG nova.network.os_vif_util [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:6b:89,bridge_name='br-int',has_traffic_filtering=True,id=ed766c0f-2a6a-47e2-b4a2-456e041040d2,network=Network(a8815e8f-3945-4eb5-98d8-e31bbb7c874a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped766c0f-2a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:38:04 compute-0 nova_compute[186662]: 2026-02-19 19:38:04.721 186666 DEBUG nova.virt.libvirt.migration [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Updating guest XML with vif config: <interface type="ethernet">
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <mac address="fa:16:3e:04:6b:89"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <model type="virtio"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <mtu size="1442"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <target dev="taped766c0f-2a"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]: </interface>
Feb 19 19:38:04 compute-0 nova_compute[186662]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Feb 19 19:38:04 compute-0 nova_compute[186662]: 2026-02-19 19:38:04.722 186666 DEBUG nova.virt.libvirt.migration [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <name>instance-00000011</name>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <uuid>31cc0f13-bb90-49cf-b9a0-e018920aabeb</uuid>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-185556420</nova:name>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:37:31</nova:creationTime>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:38:04 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:38:04 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:user uuid="08866af24c3a4551a81c8099dc9049fb">tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347-project-admin</nova:user>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:project uuid="673c40a9f52d4914b8ae3fc458b05edf">tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347</nova:project>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:port uuid="ed766c0f-2a6a-47e2-b4a2-456e041040d2">
Feb 19 19:38:04 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <memory unit="KiB">131072</memory>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <currentMemory unit="KiB">131072</currentMemory>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <vcpu placement="static">1</vcpu>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <resource>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <partition>/machine</partition>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </resource>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <system>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <entry name="serial">31cc0f13-bb90-49cf-b9a0-e018920aabeb</entry>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <entry name="uuid">31cc0f13-bb90-49cf-b9a0-e018920aabeb</entry>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </system>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <os>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </os>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <features>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <vmcoreinfo state="on"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </features>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact" check="partial">
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <model fallback="allow">Nehalem</model>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <on_poweroff>destroy</on_poweroff>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <on_reboot>restart</on_reboot>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <on_crash>destroy</on_crash>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk.config"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <readonly/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="0" model="pcie-root"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="1" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="1" port="0x10"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="2" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="2" port="0x11"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="3" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="3" port="0x12"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="4" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="4" port="0x13"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="5" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="5" port="0x14"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="6" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="6" port="0x15"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="7" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="7" port="0x16"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="8" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="8" port="0x17"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="9" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="9" port="0x18"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="10" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="10" port="0x19"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="11" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="11" port="0x1a"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="12" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="12" port="0x1b"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="13" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="13" port="0x1c"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="14" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="14" port="0x1d"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="15" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="15" port="0x1e"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="16" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="16" port="0x1f"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="17" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="17" port="0x20"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="18" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="18" port="0x21"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="19" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="19" port="0x22"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="20" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="20" port="0x23"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="21" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="21" port="0x24"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="22" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="22" port="0x25"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="23" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="23" port="0x26"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="24" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="24" port="0x27"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="25" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="25" port="0x28"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-pci-bridge"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="usb" index="0" model="piix3-uhci">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="sata" index="0">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <interface type="ethernet"><mac address="fa:16:3e:04:6b:89"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="taped766c0f-2a"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </interface><serial type="pty">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/console.log" append="off"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target type="isa-serial" port="0">
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <model name="isa-serial"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       </target>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <console type="pty">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/console.log" append="off"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target type="serial" port="0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </console>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="usb" bus="0" port="1"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </input>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <input type="mouse" bus="ps2"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <listen type="address" address="::"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </graphics>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <video>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model type="virtio" heads="1" primary="yes"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </video>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]: </domain>
Feb 19 19:38:04 compute-0 nova_compute[186662]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Feb 19 19:38:04 compute-0 nova_compute[186662]: 2026-02-19 19:38:04.723 186666 DEBUG nova.virt.libvirt.migration [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <name>instance-00000011</name>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <uuid>31cc0f13-bb90-49cf-b9a0-e018920aabeb</uuid>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-185556420</nova:name>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:37:31</nova:creationTime>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:38:04 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:38:04 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:user uuid="08866af24c3a4551a81c8099dc9049fb">tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347-project-admin</nova:user>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:project uuid="673c40a9f52d4914b8ae3fc458b05edf">tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347</nova:project>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:port uuid="ed766c0f-2a6a-47e2-b4a2-456e041040d2">
Feb 19 19:38:04 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <memory unit="KiB">131072</memory>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <currentMemory unit="KiB">131072</currentMemory>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <vcpu placement="static">1</vcpu>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <resource>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <partition>/machine</partition>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </resource>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <system>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <entry name="serial">31cc0f13-bb90-49cf-b9a0-e018920aabeb</entry>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <entry name="uuid">31cc0f13-bb90-49cf-b9a0-e018920aabeb</entry>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </system>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <os>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </os>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <features>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <vmcoreinfo state="on"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </features>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact" check="partial">
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <model fallback="allow">Nehalem</model>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <on_poweroff>destroy</on_poweroff>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <on_reboot>restart</on_reboot>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <on_crash>destroy</on_crash>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk.config"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <readonly/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="0" model="pcie-root"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="1" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="1" port="0x10"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="2" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="2" port="0x11"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="3" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="3" port="0x12"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="4" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="4" port="0x13"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="5" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="5" port="0x14"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="6" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="6" port="0x15"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="7" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="7" port="0x16"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="8" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="8" port="0x17"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="9" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="9" port="0x18"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="10" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="10" port="0x19"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="11" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="11" port="0x1a"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="12" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="12" port="0x1b"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="13" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="13" port="0x1c"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="14" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="14" port="0x1d"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="15" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="15" port="0x1e"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="16" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="16" port="0x1f"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="17" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="17" port="0x20"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="18" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="18" port="0x21"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="19" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="19" port="0x22"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="20" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="20" port="0x23"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="21" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="21" port="0x24"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="22" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="22" port="0x25"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="23" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="23" port="0x26"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="24" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="24" port="0x27"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="25" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="25" port="0x28"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-pci-bridge"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="usb" index="0" model="piix3-uhci">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="sata" index="0">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <interface type="ethernet">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <mac address="fa:16:3e:04:6b:89"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <mtu size="1442"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target dev="taped766c0f-2a"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <serial type="pty">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/console.log" append="off"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target type="isa-serial" port="0">
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <model name="isa-serial"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       </target>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <console type="pty">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/console.log" append="off"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target type="serial" port="0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </console>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="usb" bus="0" port="1"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </input>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <input type="mouse" bus="ps2"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <listen type="address" address="::"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </graphics>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <video>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model type="virtio" heads="1" primary="yes"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </video>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]: </domain>
Feb 19 19:38:04 compute-0 nova_compute[186662]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Feb 19 19:38:04 compute-0 nova_compute[186662]: 2026-02-19 19:38:04.725 186666 DEBUG nova.virt.libvirt.migration [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] _update_pci_xml output xml=<domain type="kvm">
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <name>instance-00000011</name>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <uuid>31cc0f13-bb90-49cf-b9a0-e018920aabeb</uuid>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-185556420</nova:name>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:37:31</nova:creationTime>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:38:04 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:38:04 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:user uuid="08866af24c3a4551a81c8099dc9049fb">tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347-project-admin</nova:user>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:project uuid="673c40a9f52d4914b8ae3fc458b05edf">tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347</nova:project>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <nova:port uuid="ed766c0f-2a6a-47e2-b4a2-456e041040d2">
Feb 19 19:38:04 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <memory unit="KiB">131072</memory>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <currentMemory unit="KiB">131072</currentMemory>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <vcpu placement="static">1</vcpu>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <resource>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <partition>/machine</partition>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </resource>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <system>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <entry name="serial">31cc0f13-bb90-49cf-b9a0-e018920aabeb</entry>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <entry name="uuid">31cc0f13-bb90-49cf-b9a0-e018920aabeb</entry>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </system>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <os>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </os>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <features>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <vmcoreinfo state="on"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </features>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact" check="partial">
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <model fallback="allow">Nehalem</model>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <on_poweroff>destroy</on_poweroff>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <on_reboot>restart</on_reboot>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <on_crash>destroy</on_crash>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/disk.config"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <readonly/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="0" model="pcie-root"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="1" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="1" port="0x10"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="2" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="2" port="0x11"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="3" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="3" port="0x12"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="4" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="4" port="0x13"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="5" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="5" port="0x14"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="6" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="6" port="0x15"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="7" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="7" port="0x16"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="8" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="8" port="0x17"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="9" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="9" port="0x18"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="10" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="10" port="0x19"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="11" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="11" port="0x1a"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="12" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="12" port="0x1b"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="13" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="13" port="0x1c"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="14" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="14" port="0x1d"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="15" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="15" port="0x1e"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="16" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="16" port="0x1f"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="17" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="17" port="0x20"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="18" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="18" port="0x21"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="19" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="19" port="0x22"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="20" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="20" port="0x23"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="21" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="21" port="0x24"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="22" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="22" port="0x25"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="23" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="23" port="0x26"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="24" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="24" port="0x27"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="25" model="pcie-root-port">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target chassis="25" port="0x28"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model name="pcie-pci-bridge"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="usb" index="0" model="piix3-uhci">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <controller type="sata" index="0">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <interface type="ethernet">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <mac address="fa:16:3e:04:6b:89"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <mtu size="1442"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target dev="taped766c0f-2a"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <serial type="pty">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/console.log" append="off"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target type="isa-serial" port="0">
Feb 19 19:38:04 compute-0 nova_compute[186662]:         <model name="isa-serial"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       </target>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <console type="pty">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb/console.log" append="off"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <target type="serial" port="0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </console>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="usb" bus="0" port="1"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </input>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <input type="mouse" bus="ps2"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <listen type="address" address="::"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </graphics>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <video>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <model type="virtio" heads="1" primary="yes"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </video>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:38:04 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:38:04 compute-0 nova_compute[186662]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Feb 19 19:38:04 compute-0 nova_compute[186662]: </domain>
Feb 19 19:38:04 compute-0 nova_compute[186662]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Feb 19 19:38:04 compute-0 nova_compute[186662]: 2026-02-19 19:38:04.726 186666 DEBUG nova.virt.libvirt.driver [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Feb 19 19:38:04 compute-0 nova_compute[186662]: 2026-02-19 19:38:04.774 186666 WARNING neutronclient.v2_0.client [req-fe31b7fd-f98a-41b8-8c74-3fda4f4e6f5b req-630fade9-4359-444a-b5a6-b9214f993790 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:38:04 compute-0 nova_compute[186662]: 2026-02-19 19:38:04.913 186666 DEBUG nova.network.neutron [req-fe31b7fd-f98a-41b8-8c74-3fda4f4e6f5b req-630fade9-4359-444a-b5a6-b9214f993790 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Updated VIF entry in instance network info cache for port ed766c0f-2a6a-47e2-b4a2-456e041040d2. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Feb 19 19:38:04 compute-0 nova_compute[186662]: 2026-02-19 19:38:04.913 186666 DEBUG nova.network.neutron [req-fe31b7fd-f98a-41b8-8c74-3fda4f4e6f5b req-630fade9-4359-444a-b5a6-b9214f993790 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Updating instance_info_cache with network_info: [{"id": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "address": "fa:16:3e:04:6b:89", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped766c0f-2a", "ovs_interfaceid": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:38:05 compute-0 nova_compute[186662]: 2026-02-19 19:38:05.206 186666 DEBUG nova.virt.libvirt.migration [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Feb 19 19:38:05 compute-0 nova_compute[186662]: 2026-02-19 19:38:05.206 186666 INFO nova.virt.libvirt.migration [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 19 19:38:05 compute-0 ovn_controller[96653]: 2026-02-19T19:38:05Z|00146|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Feb 19 19:38:05 compute-0 nova_compute[186662]: 2026-02-19 19:38:05.420 186666 DEBUG oslo_concurrency.lockutils [req-fe31b7fd-f98a-41b8-8c74-3fda4f4e6f5b req-630fade9-4359-444a-b5a6-b9214f993790 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-31cc0f13-bb90-49cf-b9a0-e018920aabeb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:38:06 compute-0 kernel: taped766c0f-2a (unregistering): left promiscuous mode
Feb 19 19:38:06 compute-0 NetworkManager[56519]: <info>  [1771529886.0164] device (taped766c0f-2a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.017 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:06 compute-0 ovn_controller[96653]: 2026-02-19T19:38:06Z|00147|binding|INFO|Releasing lport ed766c0f-2a6a-47e2-b4a2-456e041040d2 from this chassis (sb_readonly=0)
Feb 19 19:38:06 compute-0 ovn_controller[96653]: 2026-02-19T19:38:06Z|00148|binding|INFO|Setting lport ed766c0f-2a6a-47e2-b4a2-456e041040d2 down in Southbound
Feb 19 19:38:06 compute-0 ovn_controller[96653]: 2026-02-19T19:38:06Z|00149|binding|INFO|Removing iface taped766c0f-2a ovn-installed in OVS
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.023 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.024 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:06 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:06.028 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:6b:89 10.100.0.12'], port_security=['fa:16:3e:04:6b:89 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'd8481919-b10e-4218-b697-835a5c48ac63'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '31cc0f13-bb90-49cf-b9a0-e018920aabeb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a8815e8f-3945-4eb5-98d8-e31bbb7c874a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '673c40a9f52d4914b8ae3fc458b05edf', 'neutron:revision_number': '10', 'neutron:security_group_ids': '21b17461-96ed-47ff-b7be-d33698019865', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ccf569e3-981f-424a-aec6-af69ea494142, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=ed766c0f-2a6a-47e2-b4a2-456e041040d2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:38:06 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:06.029 105986 INFO neutron.agent.ovn.metadata.agent [-] Port ed766c0f-2a6a-47e2-b4a2-456e041040d2 in datapath a8815e8f-3945-4eb5-98d8-e31bbb7c874a unbound from our chassis
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.029 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:06 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:06.030 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a8815e8f-3945-4eb5-98d8-e31bbb7c874a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:38:06 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:06.031 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[9a3d7b55-94fd-40d8-9bc8-c8f2ddf903de]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:38:06 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:06.031 105986 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a namespace which is not needed anymore
Feb 19 19:38:06 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000011.scope: Deactivated successfully.
Feb 19 19:38:06 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000011.scope: Consumed 13.061s CPU time.
Feb 19 19:38:06 compute-0 systemd-machined[156014]: Machine qemu-13-instance-00000011 terminated.
Feb 19 19:38:06 compute-0 neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a[213375]: [NOTICE]   (213379) : haproxy version is 3.0.5-8e879a5
Feb 19 19:38:06 compute-0 neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a[213375]: [NOTICE]   (213379) : path to executable is /usr/sbin/haproxy
Feb 19 19:38:06 compute-0 neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a[213375]: [WARNING]  (213379) : Exiting Master process...
Feb 19 19:38:06 compute-0 podman[213539]: 2026-02-19 19:38:06.121773325 +0000 UTC m=+0.020875539 container kill 604bbb662c7722b0ca23a6f27a4ff2eab79c8ea873258811f37020eb1a190180 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:38:06 compute-0 neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a[213375]: [ALERT]    (213379) : Current worker (213381) exited with code 143 (Terminated)
Feb 19 19:38:06 compute-0 neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a[213375]: [WARNING]  (213379) : All workers exited. Exiting... (0)
Feb 19 19:38:06 compute-0 systemd[1]: libpod-604bbb662c7722b0ca23a6f27a4ff2eab79c8ea873258811f37020eb1a190180.scope: Deactivated successfully.
Feb 19 19:38:06 compute-0 podman[213555]: 2026-02-19 19:38:06.155445794 +0000 UTC m=+0.020656134 container died 604bbb662c7722b0ca23a6f27a4ff2eab79c8ea873258811f37020eb1a190180 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a, io.buildah.version=1.43.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 19 19:38:06 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-604bbb662c7722b0ca23a6f27a4ff2eab79c8ea873258811f37020eb1a190180-userdata-shm.mount: Deactivated successfully.
Feb 19 19:38:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-c62c4e64ddd1835091653438a595c64b7f5a54ab93b8733b98a05af02e1e97cb-merged.mount: Deactivated successfully.
Feb 19 19:38:06 compute-0 podman[213555]: 2026-02-19 19:38:06.20756065 +0000 UTC m=+0.072770980 container cleanup 604bbb662c7722b0ca23a6f27a4ff2eab79c8ea873258811f37020eb1a190180 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:38:06 compute-0 systemd[1]: libpod-conmon-604bbb662c7722b0ca23a6f27a4ff2eab79c8ea873258811f37020eb1a190180.scope: Deactivated successfully.
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.217 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.220 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.227 186666 INFO nova.virt.libvirt.driver [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 19 19:38:06 compute-0 podman[213556]: 2026-02-19 19:38:06.228605012 +0000 UTC m=+0.090569973 container remove 604bbb662c7722b0ca23a6f27a4ff2eab79c8ea873258811f37020eb1a190180 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216)
Feb 19 19:38:06 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:06.232 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[4c72638d-acb6-4d40-8d02-86f903f0f143]: (4, ("Thu Feb 19 07:38:06 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a (604bbb662c7722b0ca23a6f27a4ff2eab79c8ea873258811f37020eb1a190180)\n604bbb662c7722b0ca23a6f27a4ff2eab79c8ea873258811f37020eb1a190180\nThu Feb 19 07:38:06 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a (604bbb662c7722b0ca23a6f27a4ff2eab79c8ea873258811f37020eb1a190180)\n604bbb662c7722b0ca23a6f27a4ff2eab79c8ea873258811f37020eb1a190180\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:38:06 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:06.233 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[1919e522-5877-4d6f-b0ff-ccadb9db7b96]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:38:06 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:06.234 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:38:06 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:06.234 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[d37ed718-fe9f-4515-89f9-6b21aae1f384]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:38:06 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:06.235 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa8815e8f-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.236 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:06 compute-0 kernel: tapa8815e8f-30: left promiscuous mode
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.243 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:06 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:06.245 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[8a1a2d52-4309-4ba3-bb7d-0463b31433bf]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.247 186666 DEBUG nova.virt.libvirt.driver [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.248 186666 DEBUG nova.virt.libvirt.driver [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.248 186666 DEBUG nova.virt.libvirt.driver [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Feb 19 19:38:06 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:06.258 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[d23f4dbf-ec3d-4c5c-aec3-d8d003d97a9a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:38:06 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:06.258 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[eb63dd69-5ce4-416d-8ac6-8ac402d8e929]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:38:06 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:06.268 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a978c23f-a095-4e60-8e95-d719e3bddfdb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412466, 'reachable_time': 43697, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213603, 'error': None, 'target': 'ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:38:06 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:06.270 106358 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Feb 19 19:38:06 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:06.270 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[22ea8c7b-dd9a-4fbb-b0de-544061888bbe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:38:06 compute-0 systemd[1]: run-netns-ovnmeta\x2da8815e8f\x2d3945\x2d4eb5\x2d98d8\x2de31bbb7c874a.mount: Deactivated successfully.
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.730 186666 DEBUG nova.virt.libvirt.guest [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '31cc0f13-bb90-49cf-b9a0-e018920aabeb' (instance-00000011) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.730 186666 INFO nova.virt.libvirt.driver [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Migration operation has completed
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.730 186666 INFO nova.compute.manager [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] _post_live_migration() is started..
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.743 186666 WARNING neutronclient.v2_0.client [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.744 186666 WARNING neutronclient.v2_0.client [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.936 186666 DEBUG nova.compute.manager [req-4152f448-f68a-447e-8576-7c08c46a8905 req-3ba3a400-02dc-49e7-b709-0d2370ed5657 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Received event network-vif-unplugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.936 186666 DEBUG oslo_concurrency.lockutils [req-4152f448-f68a-447e-8576-7c08c46a8905 req-3ba3a400-02dc-49e7-b709-0d2370ed5657 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.937 186666 DEBUG oslo_concurrency.lockutils [req-4152f448-f68a-447e-8576-7c08c46a8905 req-3ba3a400-02dc-49e7-b709-0d2370ed5657 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.937 186666 DEBUG oslo_concurrency.lockutils [req-4152f448-f68a-447e-8576-7c08c46a8905 req-3ba3a400-02dc-49e7-b709-0d2370ed5657 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.937 186666 DEBUG nova.compute.manager [req-4152f448-f68a-447e-8576-7c08c46a8905 req-3ba3a400-02dc-49e7-b709-0d2370ed5657 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] No waiting events found dispatching network-vif-unplugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:38:06 compute-0 nova_compute[186662]: 2026-02-19 19:38:06.938 186666 DEBUG nova.compute.manager [req-4152f448-f68a-447e-8576-7c08c46a8905 req-3ba3a400-02dc-49e7-b709-0d2370ed5657 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Received event network-vif-unplugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.012 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.287 186666 DEBUG nova.network.neutron [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Activated binding for port ed766c0f-2a6a-47e2-b4a2-456e041040d2 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.288 186666 DEBUG nova.compute.manager [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "address": "fa:16:3e:04:6b:89", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped766c0f-2a", "ovs_interfaceid": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.289 186666 DEBUG nova.virt.libvirt.vif [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:37:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-185556420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-185',id=17,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:37:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='673c40a9f52d4914b8ae3fc458b05edf',ramdisk_id='',reservation_id='r-2ie1ziua',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',
image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:37:45Z,user_data=None,user_id='08866af24c3a4551a81c8099dc9049fb',uuid=31cc0f13-bb90-49cf-b9a0-e018920aabeb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "address": "fa:16:3e:04:6b:89", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped766c0f-2a", "ovs_interfaceid": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.289 186666 DEBUG nova.network.os_vif_util [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "address": "fa:16:3e:04:6b:89", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped766c0f-2a", "ovs_interfaceid": "ed766c0f-2a6a-47e2-b4a2-456e041040d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.290 186666 DEBUG nova.network.os_vif_util [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:6b:89,bridge_name='br-int',has_traffic_filtering=True,id=ed766c0f-2a6a-47e2-b4a2-456e041040d2,network=Network(a8815e8f-3945-4eb5-98d8-e31bbb7c874a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped766c0f-2a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.291 186666 DEBUG os_vif [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:6b:89,bridge_name='br-int',has_traffic_filtering=True,id=ed766c0f-2a6a-47e2-b4a2-456e041040d2,network=Network(a8815e8f-3945-4eb5-98d8-e31bbb7c874a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped766c0f-2a') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.294 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.294 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped766c0f-2a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.296 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.297 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.298 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.299 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=39102e8b-e7d8-4480-a8cf-90f42674b4c2) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.301 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.304 186666 INFO os_vif [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:6b:89,bridge_name='br-int',has_traffic_filtering=True,id=ed766c0f-2a6a-47e2-b4a2-456e041040d2,network=Network(a8815e8f-3945-4eb5-98d8-e31bbb7c874a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped766c0f-2a')
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.304 186666 DEBUG oslo_concurrency.lockutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.305 186666 DEBUG oslo_concurrency.lockutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.305 186666 DEBUG oslo_concurrency.lockutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.305 186666 DEBUG nova.compute.manager [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.306 186666 INFO nova.virt.libvirt.driver [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Deleting instance files /var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb_del
Feb 19 19:38:07 compute-0 nova_compute[186662]: 2026-02-19 19:38:07.307 186666 INFO nova.virt.libvirt.driver [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Deletion of /var/lib/nova/instances/31cc0f13-bb90-49cf-b9a0-e018920aabeb_del complete
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.112 186666 DEBUG nova.compute.manager [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Received event network-vif-plugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.113 186666 DEBUG oslo_concurrency.lockutils [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.114 186666 DEBUG oslo_concurrency.lockutils [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.114 186666 DEBUG oslo_concurrency.lockutils [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.115 186666 DEBUG nova.compute.manager [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] No waiting events found dispatching network-vif-plugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.115 186666 WARNING nova.compute.manager [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Received unexpected event network-vif-plugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 for instance with vm_state active and task_state migrating.
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.115 186666 DEBUG nova.compute.manager [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Received event network-vif-unplugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.116 186666 DEBUG oslo_concurrency.lockutils [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.116 186666 DEBUG oslo_concurrency.lockutils [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.117 186666 DEBUG oslo_concurrency.lockutils [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.117 186666 DEBUG nova.compute.manager [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] No waiting events found dispatching network-vif-unplugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.117 186666 DEBUG nova.compute.manager [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Received event network-vif-unplugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.118 186666 DEBUG nova.compute.manager [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Received event network-vif-unplugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.118 186666 DEBUG oslo_concurrency.lockutils [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.119 186666 DEBUG oslo_concurrency.lockutils [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.119 186666 DEBUG oslo_concurrency.lockutils [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.119 186666 DEBUG nova.compute.manager [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] No waiting events found dispatching network-vif-unplugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.120 186666 DEBUG nova.compute.manager [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Received event network-vif-unplugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.120 186666 DEBUG nova.compute.manager [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Received event network-vif-plugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.120 186666 DEBUG oslo_concurrency.lockutils [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.121 186666 DEBUG oslo_concurrency.lockutils [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.121 186666 DEBUG oslo_concurrency.lockutils [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.122 186666 DEBUG nova.compute.manager [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] No waiting events found dispatching network-vif-plugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.122 186666 WARNING nova.compute.manager [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Received unexpected event network-vif-plugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 for instance with vm_state active and task_state migrating.
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.122 186666 DEBUG nova.compute.manager [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Received event network-vif-plugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.122 186666 DEBUG oslo_concurrency.lockutils [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.123 186666 DEBUG oslo_concurrency.lockutils [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.123 186666 DEBUG oslo_concurrency.lockutils [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.123 186666 DEBUG nova.compute.manager [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] No waiting events found dispatching network-vif-plugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.123 186666 WARNING nova.compute.manager [req-b1281627-75cf-4b38-8b4d-d926c1c97e2b req-e8e3b47e-0067-477f-85cf-3f42e8eae2b3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Received unexpected event network-vif-plugged-ed766c0f-2a6a-47e2-b4a2-456e041040d2 for instance with vm_state active and task_state migrating.
Feb 19 19:38:09 compute-0 nova_compute[186662]: 2026-02-19 19:38:09.191 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:10 compute-0 sshd-session[213604]: Invalid user x from 96.78.175.42 port 34692
Feb 19 19:38:10 compute-0 sshd-session[213604]: Received disconnect from 96.78.175.42 port 34692:11: Bye Bye [preauth]
Feb 19 19:38:10 compute-0 sshd-session[213604]: Disconnected from invalid user x 96.78.175.42 port 34692 [preauth]
Feb 19 19:38:12 compute-0 nova_compute[186662]: 2026-02-19 19:38:12.300 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:14 compute-0 nova_compute[186662]: 2026-02-19 19:38:14.232 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:15 compute-0 podman[213609]: 2026-02-19 19:38:15.28556587 +0000 UTC m=+0.064278554 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:38:15 compute-0 nova_compute[186662]: 2026-02-19 19:38:15.848 186666 DEBUG oslo_concurrency.lockutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:38:15 compute-0 nova_compute[186662]: 2026-02-19 19:38:15.848 186666 DEBUG oslo_concurrency.lockutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:38:15 compute-0 nova_compute[186662]: 2026-02-19 19:38:15.848 186666 DEBUG oslo_concurrency.lockutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "31cc0f13-bb90-49cf-b9a0-e018920aabeb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:38:15 compute-0 sshd-session[213607]: Received disconnect from 106.51.64.128 port 57031:11: Bye Bye [preauth]
Feb 19 19:38:15 compute-0 sshd-session[213607]: Disconnected from authenticating user root 106.51.64.128 port 57031 [preauth]
Feb 19 19:38:16 compute-0 nova_compute[186662]: 2026-02-19 19:38:16.363 186666 DEBUG oslo_concurrency.lockutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:38:16 compute-0 nova_compute[186662]: 2026-02-19 19:38:16.364 186666 DEBUG oslo_concurrency.lockutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:38:16 compute-0 nova_compute[186662]: 2026-02-19 19:38:16.364 186666 DEBUG oslo_concurrency.lockutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:38:16 compute-0 nova_compute[186662]: 2026-02-19 19:38:16.364 186666 DEBUG nova.compute.resource_tracker [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:38:16 compute-0 nova_compute[186662]: 2026-02-19 19:38:16.560 186666 WARNING nova.virt.libvirt.driver [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:38:16 compute-0 nova_compute[186662]: 2026-02-19 19:38:16.562 186666 DEBUG oslo_concurrency.processutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:38:16 compute-0 nova_compute[186662]: 2026-02-19 19:38:16.585 186666 DEBUG oslo_concurrency.processutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:38:16 compute-0 nova_compute[186662]: 2026-02-19 19:38:16.586 186666 DEBUG nova.compute.resource_tracker [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5773MB free_disk=72.9761734008789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": 
"0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:38:16 compute-0 nova_compute[186662]: 2026-02-19 19:38:16.587 186666 DEBUG oslo_concurrency.lockutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:38:16 compute-0 nova_compute[186662]: 2026-02-19 19:38:16.587 186666 DEBUG oslo_concurrency.lockutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:38:17 compute-0 nova_compute[186662]: 2026-02-19 19:38:17.303 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:17 compute-0 nova_compute[186662]: 2026-02-19 19:38:17.611 186666 DEBUG nova.compute.resource_tracker [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Migration for instance 31cc0f13-bb90-49cf-b9a0-e018920aabeb refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Feb 19 19:38:18 compute-0 nova_compute[186662]: 2026-02-19 19:38:18.118 186666 DEBUG nova.compute.resource_tracker [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Feb 19 19:38:18 compute-0 nova_compute[186662]: 2026-02-19 19:38:18.193 186666 DEBUG nova.compute.resource_tracker [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Migration 50b28c67-6a2e-41ca-9408-9a0b465f5e60 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Feb 19 19:38:18 compute-0 nova_compute[186662]: 2026-02-19 19:38:18.193 186666 DEBUG nova.compute.resource_tracker [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:38:18 compute-0 nova_compute[186662]: 2026-02-19 19:38:18.194 186666 DEBUG nova.compute.resource_tracker [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:38:16 up  1:09,  0 user,  load average: 0.25, 0.31, 0.34\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:38:18 compute-0 nova_compute[186662]: 2026-02-19 19:38:18.222 186666 DEBUG nova.compute.provider_tree [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:38:18 compute-0 nova_compute[186662]: 2026-02-19 19:38:18.728 186666 DEBUG nova.scheduler.client.report [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:38:19 compute-0 nova_compute[186662]: 2026-02-19 19:38:19.234 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:19 compute-0 nova_compute[186662]: 2026-02-19 19:38:19.235 186666 DEBUG nova.compute.resource_tracker [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:38:19 compute-0 nova_compute[186662]: 2026-02-19 19:38:19.235 186666 DEBUG oslo_concurrency.lockutils [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.648s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:38:19 compute-0 nova_compute[186662]: 2026-02-19 19:38:19.250 186666 INFO nova.compute.manager [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 19 19:38:19 compute-0 nova_compute[186662]: 2026-02-19 19:38:19.347 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:38:19 compute-0 nova_compute[186662]: 2026-02-19 19:38:19.347 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:38:19 compute-0 nova_compute[186662]: 2026-02-19 19:38:19.347 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:38:20 compute-0 podman[213630]: 2026-02-19 19:38:20.292195273 +0000 UTC m=+0.064637512 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, version=9.7, distribution-scope=public, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=openstack_network_exporter, release=1770267347, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 19 19:38:20 compute-0 nova_compute[186662]: 2026-02-19 19:38:20.309 186666 INFO nova.scheduler.client.report [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Deleted allocation for migration 50b28c67-6a2e-41ca-9408-9a0b465f5e60
Feb 19 19:38:20 compute-0 nova_compute[186662]: 2026-02-19 19:38:20.309 186666 DEBUG nova.virt.libvirt.driver [None req-e0363a4f-8d69-4963-98cb-90d5a10b05b0 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 31cc0f13-bb90-49cf-b9a0-e018920aabeb] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Feb 19 19:38:20 compute-0 nova_compute[186662]: 2026-02-19 19:38:20.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:38:20 compute-0 nova_compute[186662]: 2026-02-19 19:38:20.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:38:20 compute-0 nova_compute[186662]: 2026-02-19 19:38:20.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:38:22 compute-0 nova_compute[186662]: 2026-02-19 19:38:22.305 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:22 compute-0 podman[213651]: 2026-02-19 19:38:22.32754601 +0000 UTC m=+0.100606208 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
io.buildah.version=1.43.0, config_id=ovn_controller)
Feb 19 19:38:22 compute-0 nova_compute[186662]: 2026-02-19 19:38:22.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:38:23 compute-0 nova_compute[186662]: 2026-02-19 19:38:23.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:38:23 compute-0 nova_compute[186662]: 2026-02-19 19:38:23.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:38:24 compute-0 sshd-session[213677]: Invalid user brandon from 182.75.216.74 port 7276
Feb 19 19:38:24 compute-0 nova_compute[186662]: 2026-02-19 19:38:24.084 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:38:24 compute-0 nova_compute[186662]: 2026-02-19 19:38:24.085 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:38:24 compute-0 nova_compute[186662]: 2026-02-19 19:38:24.085 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:38:24 compute-0 nova_compute[186662]: 2026-02-19 19:38:24.085 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:38:24 compute-0 nova_compute[186662]: 2026-02-19 19:38:24.236 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:24 compute-0 nova_compute[186662]: 2026-02-19 19:38:24.254 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:38:24 compute-0 nova_compute[186662]: 2026-02-19 19:38:24.255 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:38:24 compute-0 nova_compute[186662]: 2026-02-19 19:38:24.268 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.013s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:38:24 compute-0 nova_compute[186662]: 2026-02-19 19:38:24.269 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5777MB free_disk=72.97615432739258GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:38:24 compute-0 nova_compute[186662]: 2026-02-19 19:38:24.269 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:38:24 compute-0 nova_compute[186662]: 2026-02-19 19:38:24.269 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:38:24 compute-0 sshd-session[213677]: Received disconnect from 182.75.216.74 port 7276:11: Bye Bye [preauth]
Feb 19 19:38:24 compute-0 sshd-session[213677]: Disconnected from invalid user brandon 182.75.216.74 port 7276 [preauth]
Feb 19 19:38:25 compute-0 nova_compute[186662]: 2026-02-19 19:38:25.336 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:38:25 compute-0 nova_compute[186662]: 2026-02-19 19:38:25.337 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:38:24 up  1:09,  0 user,  load average: 0.23, 0.31, 0.34\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:38:25 compute-0 nova_compute[186662]: 2026-02-19 19:38:25.365 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:38:25 compute-0 nova_compute[186662]: 2026-02-19 19:38:25.654 186666 DEBUG nova.compute.manager [None req-db8f3ae3-d4f1-4c9c-ad78-444033a59682 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:632
Feb 19 19:38:25 compute-0 nova_compute[186662]: 2026-02-19 19:38:25.697 186666 DEBUG nova.compute.provider_tree [None req-db8f3ae3-d4f1-4c9c-ad78-444033a59682 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Updating resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 generation from 21 to 24 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Feb 19 19:38:25 compute-0 nova_compute[186662]: 2026-02-19 19:38:25.874 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:38:25 compute-0 nova_compute[186662]: 2026-02-19 19:38:25.915 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 generation from 24 to 25 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Feb 19 19:38:26 compute-0 podman[213680]: 2026-02-19 19:38:26.292786985 +0000 UTC m=+0.063341121 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 19 19:38:26 compute-0 nova_compute[186662]: 2026-02-19 19:38:26.422 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:38:26 compute-0 nova_compute[186662]: 2026-02-19 19:38:26.422 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.153s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:38:27 compute-0 nova_compute[186662]: 2026-02-19 19:38:27.307 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:29 compute-0 nova_compute[186662]: 2026-02-19 19:38:29.264 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:29 compute-0 podman[196025]: time="2026-02-19T19:38:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:38:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:38:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:38:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:38:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2193 "" "Go-http-client/1.1"
Feb 19 19:38:29 compute-0 nova_compute[186662]: 2026-02-19 19:38:29.921 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:29 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:29.920 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:38:29 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:29.921 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:38:31 compute-0 openstack_network_exporter[198916]: ERROR   19:38:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:38:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:38:31 compute-0 openstack_network_exporter[198916]: ERROR   19:38:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:38:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:38:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:32.140 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:38:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:32.140 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:38:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:32.140 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:38:32 compute-0 nova_compute[186662]: 2026-02-19 19:38:32.309 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:34 compute-0 nova_compute[186662]: 2026-02-19 19:38:34.265 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:37 compute-0 nova_compute[186662]: 2026-02-19 19:38:37.350 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:38:37.923 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:38:39 compute-0 nova_compute[186662]: 2026-02-19 19:38:39.266 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:42 compute-0 nova_compute[186662]: 2026-02-19 19:38:42.352 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:44 compute-0 nova_compute[186662]: 2026-02-19 19:38:44.268 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:46 compute-0 podman[213707]: 2026-02-19 19:38:46.313830159 +0000 UTC m=+0.081677297 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible)
Feb 19 19:38:47 compute-0 nova_compute[186662]: 2026-02-19 19:38:47.355 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:49 compute-0 nova_compute[186662]: 2026-02-19 19:38:49.270 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:51 compute-0 podman[213727]: 2026-02-19 19:38:51.285184024 +0000 UTC m=+0.061957916 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, config_id=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.7)
Feb 19 19:38:52 compute-0 nova_compute[186662]: 2026-02-19 19:38:52.357 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:53 compute-0 podman[213749]: 2026-02-19 19:38:53.272926076 +0000 UTC m=+0.055245974 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 
Base Image, org.label-schema.schema-version=1.0)
Feb 19 19:38:54 compute-0 nova_compute[186662]: 2026-02-19 19:38:54.273 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:57 compute-0 podman[213775]: 2026-02-19 19:38:57.278423696 +0000 UTC m=+0.052211349 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:38:57 compute-0 nova_compute[186662]: 2026-02-19 19:38:57.359 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:59 compute-0 nova_compute[186662]: 2026-02-19 19:38:59.275 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:38:59 compute-0 podman[196025]: time="2026-02-19T19:38:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:38:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:38:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:38:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:38:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2197 "" "Go-http-client/1.1"
Feb 19 19:39:01 compute-0 openstack_network_exporter[198916]: ERROR   19:39:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:39:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:39:01 compute-0 openstack_network_exporter[198916]: ERROR   19:39:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:39:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:39:02 compute-0 nova_compute[186662]: 2026-02-19 19:39:02.361 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:02 compute-0 sshd-session[213801]: Received disconnect from 103.67.78.251 port 45080:11: Bye Bye [preauth]
Feb 19 19:39:02 compute-0 sshd-session[213801]: Disconnected from authenticating user root 103.67.78.251 port 45080 [preauth]
Feb 19 19:39:04 compute-0 nova_compute[186662]: 2026-02-19 19:39:04.277 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:07 compute-0 nova_compute[186662]: 2026-02-19 19:39:07.363 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:09 compute-0 nova_compute[186662]: 2026-02-19 19:39:09.278 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:12 compute-0 nova_compute[186662]: 2026-02-19 19:39:12.364 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:14 compute-0 nova_compute[186662]: 2026-02-19 19:39:14.280 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:17 compute-0 podman[213803]: 2026-02-19 19:39:17.280977568 +0000 UTC m=+0.051465302 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Feb 19 19:39:17 compute-0 nova_compute[186662]: 2026-02-19 19:39:17.366 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:19 compute-0 nova_compute[186662]: 2026-02-19 19:39:19.281 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:20 compute-0 sshd-session[213822]: Invalid user mohammad from 189.165.79.177 port 47912
Feb 19 19:39:20 compute-0 nova_compute[186662]: 2026-02-19 19:39:20.423 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:39:20 compute-0 nova_compute[186662]: 2026-02-19 19:39:20.423 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:39:20 compute-0 nova_compute[186662]: 2026-02-19 19:39:20.423 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:39:20 compute-0 sshd-session[213822]: Received disconnect from 189.165.79.177 port 47912:11: Bye Bye [preauth]
Feb 19 19:39:20 compute-0 sshd-session[213822]: Disconnected from invalid user mohammad 189.165.79.177 port 47912 [preauth]
Feb 19 19:39:20 compute-0 nova_compute[186662]: 2026-02-19 19:39:20.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:39:20 compute-0 nova_compute[186662]: 2026-02-19 19:39:20.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:39:21 compute-0 nova_compute[186662]: 2026-02-19 19:39:21.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:39:22 compute-0 podman[213824]: 2026-02-19 19:39:22.268030154 +0000 UTC m=+0.043793745 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, version=9.7, distribution-scope=public, name=ubi9/ubi-minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9)
Feb 19 19:39:22 compute-0 nova_compute[186662]: 2026-02-19 19:39:22.368 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:24 compute-0 nova_compute[186662]: 2026-02-19 19:39:24.282 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:24 compute-0 podman[213846]: 2026-02-19 19:39:24.297459338 +0000 UTC m=+0.071650861 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 19 19:39:24 compute-0 nova_compute[186662]: 2026-02-19 19:39:24.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:39:25 compute-0 nova_compute[186662]: 2026-02-19 19:39:25.079 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:39:25 compute-0 nova_compute[186662]: 2026-02-19 19:39:25.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:39:25 compute-0 nova_compute[186662]: 2026-02-19 19:39:25.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:39:26 compute-0 nova_compute[186662]: 2026-02-19 19:39:26.085 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:39:26 compute-0 nova_compute[186662]: 2026-02-19 19:39:26.086 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:39:26 compute-0 nova_compute[186662]: 2026-02-19 19:39:26.086 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:39:26 compute-0 nova_compute[186662]: 2026-02-19 19:39:26.086 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:39:26 compute-0 nova_compute[186662]: 2026-02-19 19:39:26.222 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:39:26 compute-0 nova_compute[186662]: 2026-02-19 19:39:26.223 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:39:26 compute-0 nova_compute[186662]: 2026-02-19 19:39:26.234 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:39:26 compute-0 nova_compute[186662]: 2026-02-19 19:39:26.235 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5811MB free_disk=72.97615814208984GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:39:26 compute-0 nova_compute[186662]: 2026-02-19 19:39:26.235 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:39:26 compute-0 nova_compute[186662]: 2026-02-19 19:39:26.236 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:39:27 compute-0 nova_compute[186662]: 2026-02-19 19:39:27.279 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:39:27 compute-0 nova_compute[186662]: 2026-02-19 19:39:27.279 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:39:26 up  1:10,  0 user,  load average: 0.25, 0.31, 0.34\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:39:27 compute-0 nova_compute[186662]: 2026-02-19 19:39:27.299 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing inventories for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Feb 19 19:39:27 compute-0 nova_compute[186662]: 2026-02-19 19:39:27.312 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating ProviderTree inventory for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Feb 19 19:39:27 compute-0 nova_compute[186662]: 2026-02-19 19:39:27.312 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating inventory in ProviderTree for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Feb 19 19:39:27 compute-0 nova_compute[186662]: 2026-02-19 19:39:27.326 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing aggregate associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Feb 19 19:39:27 compute-0 nova_compute[186662]: 2026-02-19 19:39:27.343 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing trait associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STATUS_DISABLED,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_SSE2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,HW_ARCH_X86_64,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_USB _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Feb 19 19:39:27 compute-0 nova_compute[186662]: 2026-02-19 19:39:27.358 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:39:27 compute-0 nova_compute[186662]: 2026-02-19 19:39:27.370 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:27 compute-0 nova_compute[186662]: 2026-02-19 19:39:27.865 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:39:27 compute-0 nova_compute[186662]: 2026-02-19 19:39:27.923 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 generation from 25 to 26 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Feb 19 19:39:28 compute-0 podman[213873]: 2026-02-19 19:39:28.264168327 +0000 UTC m=+0.042168216 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 19 19:39:28 compute-0 nova_compute[186662]: 2026-02-19 19:39:28.434 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:39:28 compute-0 nova_compute[186662]: 2026-02-19 19:39:28.434 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.199s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:39:29 compute-0 nova_compute[186662]: 2026-02-19 19:39:29.285 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:29 compute-0 podman[196025]: time="2026-02-19T19:39:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:39:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:39:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:39:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:39:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2193 "" "Go-http-client/1.1"
Feb 19 19:39:31 compute-0 openstack_network_exporter[198916]: ERROR   19:39:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:39:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:39:31 compute-0 openstack_network_exporter[198916]: ERROR   19:39:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:39:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:39:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:32.141 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:39:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:32.141 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:39:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:32.141 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:39:32 compute-0 nova_compute[186662]: 2026-02-19 19:39:32.372 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:33 compute-0 ovn_controller[96653]: 2026-02-19T19:39:33Z|00150|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Feb 19 19:39:34 compute-0 nova_compute[186662]: 2026-02-19 19:39:34.288 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:37 compute-0 nova_compute[186662]: 2026-02-19 19:39:37.415 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:39 compute-0 nova_compute[186662]: 2026-02-19 19:39:39.290 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:41 compute-0 nova_compute[186662]: 2026-02-19 19:39:41.694 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Acquiring lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:39:41 compute-0 nova_compute[186662]: 2026-02-19 19:39:41.694 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:39:42 compute-0 nova_compute[186662]: 2026-02-19 19:39:42.201 186666 DEBUG nova.compute.manager [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Feb 19 19:39:42 compute-0 nova_compute[186662]: 2026-02-19 19:39:42.417 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:42 compute-0 nova_compute[186662]: 2026-02-19 19:39:42.747 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:39:42 compute-0 nova_compute[186662]: 2026-02-19 19:39:42.747 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:39:42 compute-0 nova_compute[186662]: 2026-02-19 19:39:42.753 186666 DEBUG nova.virt.hardware [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Feb 19 19:39:42 compute-0 nova_compute[186662]: 2026-02-19 19:39:42.753 186666 INFO nova.compute.claims [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Claim successful on node compute-0.ctlplane.example.com
Feb 19 19:39:43 compute-0 nova_compute[186662]: 2026-02-19 19:39:43.815 186666 DEBUG nova.compute.provider_tree [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:39:44 compute-0 nova_compute[186662]: 2026-02-19 19:39:44.317 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:44 compute-0 nova_compute[186662]: 2026-02-19 19:39:44.323 186666 DEBUG nova.scheduler.client.report [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:39:44 compute-0 nova_compute[186662]: 2026-02-19 19:39:44.832 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.085s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:39:44 compute-0 nova_compute[186662]: 2026-02-19 19:39:44.833 186666 DEBUG nova.compute.manager [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Feb 19 19:39:45 compute-0 nova_compute[186662]: 2026-02-19 19:39:45.343 186666 DEBUG nova.compute.manager [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Feb 19 19:39:45 compute-0 nova_compute[186662]: 2026-02-19 19:39:45.343 186666 DEBUG nova.network.neutron [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Feb 19 19:39:45 compute-0 nova_compute[186662]: 2026-02-19 19:39:45.344 186666 WARNING neutronclient.v2_0.client [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:39:45 compute-0 nova_compute[186662]: 2026-02-19 19:39:45.344 186666 WARNING neutronclient.v2_0.client [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:39:45 compute-0 nova_compute[186662]: 2026-02-19 19:39:45.854 186666 INFO nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 19 19:39:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:46.033 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:39:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:46.035 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:39:46 compute-0 nova_compute[186662]: 2026-02-19 19:39:46.034 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:46 compute-0 nova_compute[186662]: 2026-02-19 19:39:46.234 186666 DEBUG nova.network.neutron [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Successfully created port: c498aef4-51c1-44b1-973a-67630fa942ba _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Feb 19 19:39:46 compute-0 nova_compute[186662]: 2026-02-19 19:39:46.363 186666 DEBUG nova.compute.manager [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Feb 19 19:39:47 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:47.036 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.051 186666 DEBUG nova.network.neutron [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Successfully updated port: c498aef4-51c1-44b1-973a-67630fa942ba _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.112 186666 DEBUG nova.compute.manager [req-461a2b24-fefe-4e6a-9756-eeeb38693c45 req-908bf9ec-acc0-4a58-b89b-04c5026fd32d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Received event network-changed-c498aef4-51c1-44b1-973a-67630fa942ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.112 186666 DEBUG nova.compute.manager [req-461a2b24-fefe-4e6a-9756-eeeb38693c45 req-908bf9ec-acc0-4a58-b89b-04c5026fd32d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Refreshing instance network info cache due to event network-changed-c498aef4-51c1-44b1-973a-67630fa942ba. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.113 186666 DEBUG oslo_concurrency.lockutils [req-461a2b24-fefe-4e6a-9756-eeeb38693c45 req-908bf9ec-acc0-4a58-b89b-04c5026fd32d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-bf5ada51-bedd-4e83-8e0c-414b4560ffa9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.113 186666 DEBUG oslo_concurrency.lockutils [req-461a2b24-fefe-4e6a-9756-eeeb38693c45 req-908bf9ec-acc0-4a58-b89b-04c5026fd32d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-bf5ada51-bedd-4e83-8e0c-414b4560ffa9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.113 186666 DEBUG nova.network.neutron [req-461a2b24-fefe-4e6a-9756-eeeb38693c45 req-908bf9ec-acc0-4a58-b89b-04c5026fd32d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Refreshing network info cache for port c498aef4-51c1-44b1-973a-67630fa942ba _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.419 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.426 186666 DEBUG nova.compute.manager [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.428 186666 DEBUG nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.429 186666 INFO nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Creating image(s)
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.430 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Acquiring lock "/var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.430 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "/var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.431 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "/var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.432 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.439 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.442 186666 DEBUG oslo_concurrency.processutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.500 186666 DEBUG oslo_concurrency.processutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.501 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.502 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.502 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.506 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.506 186666 DEBUG oslo_concurrency.processutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.545 186666 DEBUG oslo_concurrency.processutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.545 186666 DEBUG oslo_concurrency.processutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.558 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Acquiring lock "refresh_cache-bf5ada51-bedd-4e83-8e0c-414b4560ffa9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.603 186666 DEBUG oslo_concurrency.processutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk 1073741824" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.604 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.604 186666 DEBUG oslo_concurrency.processutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.618 186666 WARNING neutronclient.v2_0.client [req-461a2b24-fefe-4e6a-9756-eeeb38693c45 req-908bf9ec-acc0-4a58-b89b-04c5026fd32d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.671 186666 DEBUG oslo_concurrency.processutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.671 186666 DEBUG nova.virt.disk.api [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Checking if we can resize image /var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.672 186666 DEBUG oslo_concurrency.processutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.723 186666 DEBUG oslo_concurrency.processutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.723 186666 DEBUG nova.virt.disk.api [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Cannot resize image /var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.724 186666 DEBUG nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.724 186666 DEBUG nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Ensure instance console log exists: /var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.724 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.724 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.725 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.810 186666 DEBUG nova.network.neutron [req-461a2b24-fefe-4e6a-9756-eeeb38693c45 req-908bf9ec-acc0-4a58-b89b-04c5026fd32d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:39:47 compute-0 nova_compute[186662]: 2026-02-19 19:39:47.939 186666 DEBUG nova.network.neutron [req-461a2b24-fefe-4e6a-9756-eeeb38693c45 req-908bf9ec-acc0-4a58-b89b-04c5026fd32d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:39:48 compute-0 podman[213917]: 2026-02-19 19:39:48.266372239 +0000 UTC m=+0.047807512 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:39:48 compute-0 nova_compute[186662]: 2026-02-19 19:39:48.494 186666 DEBUG oslo_concurrency.lockutils [req-461a2b24-fefe-4e6a-9756-eeeb38693c45 req-908bf9ec-acc0-4a58-b89b-04c5026fd32d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-bf5ada51-bedd-4e83-8e0c-414b4560ffa9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:39:48 compute-0 nova_compute[186662]: 2026-02-19 19:39:48.495 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Acquired lock "refresh_cache-bf5ada51-bedd-4e83-8e0c-414b4560ffa9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:39:48 compute-0 nova_compute[186662]: 2026-02-19 19:39:48.495 186666 DEBUG nova.network.neutron [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.075 186666 DEBUG nova.network.neutron [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.252 186666 WARNING neutronclient.v2_0.client [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.320 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.382 186666 DEBUG nova.network.neutron [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Updating instance_info_cache with network_info: [{"id": "c498aef4-51c1-44b1-973a-67630fa942ba", "address": "fa:16:3e:ac:d4:5c", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc498aef4-51", "ovs_interfaceid": "c498aef4-51c1-44b1-973a-67630fa942ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.890 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Releasing lock "refresh_cache-bf5ada51-bedd-4e83-8e0c-414b4560ffa9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.891 186666 DEBUG nova.compute.manager [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Instance network_info: |[{"id": "c498aef4-51c1-44b1-973a-67630fa942ba", "address": "fa:16:3e:ac:d4:5c", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc498aef4-51", "ovs_interfaceid": "c498aef4-51c1-44b1-973a-67630fa942ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.893 186666 DEBUG nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Start _get_guest_xml network_info=[{"id": "c498aef4-51c1-44b1-973a-67630fa942ba", "address": "fa:16:3e:ac:d4:5c", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc498aef4-51", "ovs_interfaceid": "c498aef4-51c1-44b1-973a-67630fa942ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'image_id': 'b8007ea6-afa7-4c5a-abc0-d9d7338ce087'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.897 186666 WARNING nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.898 186666 DEBUG nova.virt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-545137208', uuid='bf5ada51-bedd-4e83-8e0c-414b4560ffa9'), owner=OwnerMeta(userid='08866af24c3a4551a81c8099dc9049fb', username='tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347-project-admin', projectid='673c40a9f52d4914b8ae3fc458b05edf', projectname='tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347'), image=ImageMeta(id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "c498aef4-51c1-44b1-973a-67630fa942ba", "address": "fa:16:3e:ac:d4:5c", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": 
"tapc498aef4-51", "ovs_interfaceid": "c498aef4-51c1-44b1-973a-67630fa942ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1771529989.8985934) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.904 186666 DEBUG nova.virt.libvirt.host [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.904 186666 DEBUG nova.virt.libvirt.host [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.907 186666 DEBUG nova.virt.libvirt.host [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.907 186666 DEBUG nova.virt.libvirt.host [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.908 186666 DEBUG nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.908 186666 DEBUG nova.virt.hardware [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-19T19:19:33Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.909 186666 DEBUG nova.virt.hardware [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.909 186666 DEBUG nova.virt.hardware [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.909 186666 DEBUG nova.virt.hardware [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.909 186666 DEBUG nova.virt.hardware [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.909 186666 DEBUG nova.virt.hardware [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.909 186666 DEBUG nova.virt.hardware [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.910 186666 DEBUG nova.virt.hardware [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.910 186666 DEBUG nova.virt.hardware [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.910 186666 DEBUG nova.virt.hardware [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.910 186666 DEBUG nova.virt.hardware [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.913 186666 DEBUG nova.virt.libvirt.vif [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:39:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-545137208',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-545',id=19,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='673c40a9f52d4914b8ae3fc458b05edf',ramdisk_id='',reservation_id='r-v40wvjoi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347',owne
r_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:39:46Z,user_data=None,user_id='08866af24c3a4551a81c8099dc9049fb',uuid=bf5ada51-bedd-4e83-8e0c-414b4560ffa9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c498aef4-51c1-44b1-973a-67630fa942ba", "address": "fa:16:3e:ac:d4:5c", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc498aef4-51", "ovs_interfaceid": "c498aef4-51c1-44b1-973a-67630fa942ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.913 186666 DEBUG nova.network.os_vif_util [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Converting VIF {"id": "c498aef4-51c1-44b1-973a-67630fa942ba", "address": "fa:16:3e:ac:d4:5c", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc498aef4-51", "ovs_interfaceid": "c498aef4-51c1-44b1-973a-67630fa942ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.913 186666 DEBUG nova.network.os_vif_util [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:d4:5c,bridge_name='br-int',has_traffic_filtering=True,id=c498aef4-51c1-44b1-973a-67630fa942ba,network=Network(a8815e8f-3945-4eb5-98d8-e31bbb7c874a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc498aef4-51') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:39:49 compute-0 nova_compute[186662]: 2026-02-19 19:39:49.914 186666 DEBUG nova.objects.instance [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lazy-loading 'pci_devices' on Instance uuid bf5ada51-bedd-4e83-8e0c-414b4560ffa9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.421 186666 DEBUG nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] End _get_guest_xml xml=<domain type="kvm">
Feb 19 19:39:50 compute-0 nova_compute[186662]:   <uuid>bf5ada51-bedd-4e83-8e0c-414b4560ffa9</uuid>
Feb 19 19:39:50 compute-0 nova_compute[186662]:   <name>instance-00000013</name>
Feb 19 19:39:50 compute-0 nova_compute[186662]:   <memory>131072</memory>
Feb 19 19:39:50 compute-0 nova_compute[186662]:   <vcpu>1</vcpu>
Feb 19 19:39:50 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-545137208</nova:name>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:39:49</nova:creationTime>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:39:50 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:39:50 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:39:50 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:39:50 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:39:50 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:39:50 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:39:50 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:39:50 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:39:50 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:39:50 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:39:50 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:39:50 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:39:50 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:39:50 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:39:50 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:39:50 compute-0 nova_compute[186662]:         <nova:user uuid="08866af24c3a4551a81c8099dc9049fb">tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347-project-admin</nova:user>
Feb 19 19:39:50 compute-0 nova_compute[186662]:         <nova:project uuid="673c40a9f52d4914b8ae3fc458b05edf">tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347</nova:project>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:39:50 compute-0 nova_compute[186662]:         <nova:port uuid="c498aef4-51c1-44b1-973a-67630fa942ba">
Feb 19 19:39:50 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:39:50 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:39:50 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <system>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <entry name="serial">bf5ada51-bedd-4e83-8e0c-414b4560ffa9</entry>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <entry name="uuid">bf5ada51-bedd-4e83-8e0c-414b4560ffa9</entry>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     </system>
Feb 19 19:39:50 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:39:50 compute-0 nova_compute[186662]:   <os>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:   </os>
Feb 19 19:39:50 compute-0 nova_compute[186662]:   <features>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <vmcoreinfo/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:   </features>
Feb 19 19:39:50 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:39:50 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact">
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <model>Nehalem</model>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <topology sockets="1" cores="1" threads="1"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:39:50 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk.config"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <interface type="ethernet">
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <mac address="fa:16:3e:ac:d4:5c"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <mtu size="1442"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <target dev="tapc498aef4-51"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <serial type="pty">
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/console.log" append="off"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <video>
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     </video>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <controller type="usb" index="0"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:39:50 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:39:50 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:39:50 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:39:50 compute-0 nova_compute[186662]: </domain>
Feb 19 19:39:50 compute-0 nova_compute[186662]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.422 186666 DEBUG nova.compute.manager [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Preparing to wait for external event network-vif-plugged-c498aef4-51c1-44b1-973a-67630fa942ba prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.422 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Acquiring lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.422 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.423 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.423 186666 DEBUG nova.virt.libvirt.vif [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:39:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-545137208',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-545',id=19,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='673c40a9f52d4914b8ae3fc458b05edf',ramdisk_id='',reservation_id='r-v40wvjoi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-207598
3347',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:39:46Z,user_data=None,user_id='08866af24c3a4551a81c8099dc9049fb',uuid=bf5ada51-bedd-4e83-8e0c-414b4560ffa9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c498aef4-51c1-44b1-973a-67630fa942ba", "address": "fa:16:3e:ac:d4:5c", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc498aef4-51", "ovs_interfaceid": "c498aef4-51c1-44b1-973a-67630fa942ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.424 186666 DEBUG nova.network.os_vif_util [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Converting VIF {"id": "c498aef4-51c1-44b1-973a-67630fa942ba", "address": "fa:16:3e:ac:d4:5c", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc498aef4-51", "ovs_interfaceid": "c498aef4-51c1-44b1-973a-67630fa942ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.424 186666 DEBUG nova.network.os_vif_util [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:d4:5c,bridge_name='br-int',has_traffic_filtering=True,id=c498aef4-51c1-44b1-973a-67630fa942ba,network=Network(a8815e8f-3945-4eb5-98d8-e31bbb7c874a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc498aef4-51') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.425 186666 DEBUG os_vif [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:d4:5c,bridge_name='br-int',has_traffic_filtering=True,id=c498aef4-51c1-44b1-973a-67630fa942ba,network=Network(a8815e8f-3945-4eb5-98d8-e31bbb7c874a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc498aef4-51') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.425 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.426 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.426 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.427 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.427 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'd90129d0-0605-5e53-90aa-f424e0459b37', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.428 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.430 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.434 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.434 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc498aef4-51, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.434 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapc498aef4-51, col_values=(('qos', UUID('b32a1ff8-10d8-4619-9d4f-861caa21ade8')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.434 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapc498aef4-51, col_values=(('external_ids', {'iface-id': 'c498aef4-51c1-44b1-973a-67630fa942ba', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ac:d4:5c', 'vm-uuid': 'bf5ada51-bedd-4e83-8e0c-414b4560ffa9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.435 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.436 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:50 compute-0 NetworkManager[56519]: <info>  [1771529990.4372] manager: (tapc498aef4-51): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.437 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.440 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:50 compute-0 nova_compute[186662]: 2026-02-19 19:39:50.441 186666 INFO os_vif [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:d4:5c,bridge_name='br-int',has_traffic_filtering=True,id=c498aef4-51c1-44b1-973a-67630fa942ba,network=Network(a8815e8f-3945-4eb5-98d8-e31bbb7c874a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc498aef4-51')
Feb 19 19:39:51 compute-0 nova_compute[186662]: 2026-02-19 19:39:51.971 186666 DEBUG nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:39:51 compute-0 nova_compute[186662]: 2026-02-19 19:39:51.972 186666 DEBUG nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:39:51 compute-0 nova_compute[186662]: 2026-02-19 19:39:51.973 186666 DEBUG nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] No VIF found with MAC fa:16:3e:ac:d4:5c, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Feb 19 19:39:51 compute-0 nova_compute[186662]: 2026-02-19 19:39:51.974 186666 INFO nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Using config drive
Feb 19 19:39:52 compute-0 nova_compute[186662]: 2026-02-19 19:39:52.485 186666 WARNING neutronclient.v2_0.client [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:39:52 compute-0 nova_compute[186662]: 2026-02-19 19:39:52.814 186666 INFO nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Creating config drive at /var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk.config
Feb 19 19:39:52 compute-0 nova_compute[186662]: 2026-02-19 19:39:52.821 186666 DEBUG oslo_concurrency.processutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpms1zc39h execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:39:52 compute-0 nova_compute[186662]: 2026-02-19 19:39:52.948 186666 DEBUG oslo_concurrency.processutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpms1zc39h" returned: 0 in 0.127s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:39:53 compute-0 kernel: tapc498aef4-51: entered promiscuous mode
Feb 19 19:39:53 compute-0 ovn_controller[96653]: 2026-02-19T19:39:53Z|00151|binding|INFO|Claiming lport c498aef4-51c1-44b1-973a-67630fa942ba for this chassis.
Feb 19 19:39:53 compute-0 ovn_controller[96653]: 2026-02-19T19:39:53Z|00152|binding|INFO|c498aef4-51c1-44b1-973a-67630fa942ba: Claiming fa:16:3e:ac:d4:5c 10.100.0.9
Feb 19 19:39:53 compute-0 NetworkManager[56519]: <info>  [1771529993.0185] manager: (tapc498aef4-51): new Tun device (/org/freedesktop/NetworkManager/Devices/60)
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.018 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:53 compute-0 ovn_controller[96653]: 2026-02-19T19:39:53Z|00153|binding|INFO|Setting lport c498aef4-51c1-44b1-973a-67630fa942ba up in Southbound
Feb 19 19:39:53 compute-0 ovn_controller[96653]: 2026-02-19T19:39:53Z|00154|binding|INFO|Setting lport c498aef4-51c1-44b1-973a-67630fa942ba ovn-installed in OVS
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.025 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:d4:5c 10.100.0.9'], port_security=['fa:16:3e:ac:d4:5c 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'bf5ada51-bedd-4e83-8e0c-414b4560ffa9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a8815e8f-3945-4eb5-98d8-e31bbb7c874a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '673c40a9f52d4914b8ae3fc458b05edf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '21b17461-96ed-47ff-b7be-d33698019865', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ccf569e3-981f-424a-aec6-af69ea494142, chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=c498aef4-51c1-44b1-973a-67630fa942ba) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.025 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.026 105986 INFO neutron.agent.ovn.metadata.agent [-] Port c498aef4-51c1-44b1-973a-67630fa942ba in datapath a8815e8f-3945-4eb5-98d8-e31bbb7c874a bound to our chassis
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.028 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.028 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a8815e8f-3945-4eb5-98d8-e31bbb7c874a
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.034 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.039 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[38cd3b94-a949-43da-99f0-64f4b6481975]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.040 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa8815e8f-31 in ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.041 207540 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa8815e8f-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.041 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[58e0cf5e-3623-4469-81a4-a6c1458fedbf]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.042 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[05a3a4af-c2cc-4060-8a86-75eeb9c59ec7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:39:53 compute-0 systemd-machined[156014]: New machine qemu-14-instance-00000013.
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.056 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[ab8cfea6-1939-425d-9776-8d63e85ad26f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.062 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[9c3e75ef-cbdc-43b7-8dc1-bf6734f50d32]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:39:53 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000013.
Feb 19 19:39:53 compute-0 systemd-udevd[213968]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:39:53 compute-0 NetworkManager[56519]: <info>  [1771529993.0904] device (tapc498aef4-51): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:39:53 compute-0 NetworkManager[56519]: <info>  [1771529993.0911] device (tapc498aef4-51): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.091 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[a31ec0a9-db9c-4dde-a8b4-f807a14077e2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.095 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[d852eba0-ce6e-490a-9e53-636e4bc112c7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:39:53 compute-0 NetworkManager[56519]: <info>  [1771529993.0973] manager: (tapa8815e8f-30): new Veth device (/org/freedesktop/NetworkManager/Devices/61)
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.120 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[a2eef8e3-e0be-4d7c-8718-a9d2829788cb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.123 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[b5adc68b-707e-4be5-bb9c-068cb1d02851]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:39:53 compute-0 NetworkManager[56519]: <info>  [1771529993.1420] device (tapa8815e8f-30): carrier: link connected
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.145 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[f1b53b90-9bf2-4417-a690-80232c642513]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:39:53 compute-0 podman[213948]: 2026-02-19 19:39:53.147069584 +0000 UTC m=+0.134190571 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.155 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[cf509963-1754-446e-b008-ad86ebbe2705]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa8815e8f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:32:22:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426270, 'reachable_time': 34190, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214010, 'error': None, 'target': 'ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.161 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[deb3a495-b20d-4a96-a99c-8a54c8481cda]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe32:22a3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426270, 'tstamp': 426270}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214011, 'error': None, 'target': 'ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.169 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[9aa32100-ccf9-4dce-9aa6-05cc74234210]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa8815e8f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:32:22:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426270, 'reachable_time': 34190, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214012, 'error': None, 'target': 'ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.184 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[1b2ad07b-c4f7-41fd-b8c8-88bd9be04c2a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.208 186666 DEBUG nova.compute.manager [req-4a0b8d0d-b8c8-4192-82d3-b3a4c2986729 req-0d1ea9f1-1b86-4071-ab71-e4136edaf20a 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Received event network-vif-plugged-c498aef4-51c1-44b1-973a-67630fa942ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.209 186666 DEBUG oslo_concurrency.lockutils [req-4a0b8d0d-b8c8-4192-82d3-b3a4c2986729 req-0d1ea9f1-1b86-4071-ab71-e4136edaf20a 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.209 186666 DEBUG oslo_concurrency.lockutils [req-4a0b8d0d-b8c8-4192-82d3-b3a4c2986729 req-0d1ea9f1-1b86-4071-ab71-e4136edaf20a 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.209 186666 DEBUG oslo_concurrency.lockutils [req-4a0b8d0d-b8c8-4192-82d3-b3a4c2986729 req-0d1ea9f1-1b86-4071-ab71-e4136edaf20a 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.210 186666 DEBUG nova.compute.manager [req-4a0b8d0d-b8c8-4192-82d3-b3a4c2986729 req-0d1ea9f1-1b86-4071-ab71-e4136edaf20a 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Processing event network-vif-plugged-c498aef4-51c1-44b1-973a-67630fa942ba _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.215 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[54f7505e-37bd-4147-af6f-d98e03e1c0b5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.215 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa8815e8f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.216 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.216 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa8815e8f-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:39:53 compute-0 NetworkManager[56519]: <info>  [1771529993.2183] manager: (tapa8815e8f-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Feb 19 19:39:53 compute-0 kernel: tapa8815e8f-30: entered promiscuous mode
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.219 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.221 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa8815e8f-30, col_values=(('external_ids', {'iface-id': '96538c20-f183-445c-bcae-80553c2c1fd2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:39:53 compute-0 ovn_controller[96653]: 2026-02-19T19:39:53Z|00155|binding|INFO|Releasing lport 96538c20-f183-445c-bcae-80553c2c1fd2 from this chassis (sb_readonly=0)
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.222 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.227 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.227 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[282bc14f-87be-4d29-a614-1782e6b5b5a8]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.228 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.228 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.228 105986 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for a8815e8f-3945-4eb5-98d8-e31bbb7c874a disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.228 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.228 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[e3518943-9fb1-438c-a07a-e7a73e8c061e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.228 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.229 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[b7fc506d-f28a-4456-bbec-5cf3053f806d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.229 105986 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: global
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     log         /dev/log local0 debug
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     log-tag     haproxy-metadata-proxy-a8815e8f-3945-4eb5-98d8-e31bbb7c874a
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     user        root
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     group       root
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     maxconn     1024
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     pidfile     /var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     daemon
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: defaults
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     log global
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     mode http
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     option httplog
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     option dontlognull
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     option http-server-close
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     option forwardfor
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     retries                 3
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     timeout http-request    30s
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     timeout connect         30s
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     timeout client          32s
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     timeout server          32s
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     timeout http-keep-alive 30s
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: listen listener
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     bind 169.254.169.254:80
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     server metadata /var/lib/neutron/metadata_proxy
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:     http-request add-header X-OVN-Network-ID a8815e8f-3945-4eb5-98d8-e31bbb7c874a
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Feb 19 19:39:53 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:39:53.229 105986 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a', 'env', 'PROCESS_TAG=haproxy-a8815e8f-3945-4eb5-98d8-e31bbb7c874a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.424 186666 DEBUG nova.compute.manager [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.428 186666 DEBUG nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.433 186666 INFO nova.virt.libvirt.driver [-] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Instance spawned successfully.
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.433 186666 DEBUG nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Feb 19 19:39:53 compute-0 podman[214049]: 2026-02-19 19:39:53.623836946 +0000 UTC m=+0.065922672 container create 805079d773ee951ef603bf20486f5620ab9f2624a2ad637c3b3a36a85939148c (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:39:53 compute-0 systemd[1]: Started libpod-conmon-805079d773ee951ef603bf20486f5620ab9f2624a2ad637c3b3a36a85939148c.scope.
Feb 19 19:39:53 compute-0 podman[214049]: 2026-02-19 19:39:53.576334762 +0000 UTC m=+0.018420508 image pull dfc4ee2c621ea595c16a7cd7a8233293e80df6b99346b1ce1a7ed22a35aace27 38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Feb 19 19:39:53 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:39:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3240631c70e28811cc12ab750b7a362e9893789a8787ca133aba4d12c88dd2a0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 19 19:39:53 compute-0 podman[214049]: 2026-02-19 19:39:53.698481589 +0000 UTC m=+0.140567315 container init 805079d773ee951ef603bf20486f5620ab9f2624a2ad637c3b3a36a85939148c (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 19 19:39:53 compute-0 podman[214049]: 2026-02-19 19:39:53.702449146 +0000 UTC m=+0.144534872 container start 805079d773ee951ef603bf20486f5620ab9f2624a2ad637c3b3a36a85939148c (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS)
Feb 19 19:39:53 compute-0 neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a[214065]: [NOTICE]   (214069) : New worker (214071) forked
Feb 19 19:39:53 compute-0 neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a[214065]: [NOTICE]   (214069) : Loading success.
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.953 186666 DEBUG nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.954 186666 DEBUG nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.955 186666 DEBUG nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.956 186666 DEBUG nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.957 186666 DEBUG nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:39:53 compute-0 nova_compute[186662]: 2026-02-19 19:39:53.958 186666 DEBUG nova.virt.libvirt.driver [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:39:54 compute-0 nova_compute[186662]: 2026-02-19 19:39:54.319 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:54 compute-0 nova_compute[186662]: 2026-02-19 19:39:54.468 186666 INFO nova.compute.manager [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Took 7.04 seconds to spawn the instance on the hypervisor.
Feb 19 19:39:54 compute-0 nova_compute[186662]: 2026-02-19 19:39:54.468 186666 DEBUG nova.compute.manager [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:39:55 compute-0 nova_compute[186662]: 2026-02-19 19:39:55.256 186666 DEBUG nova.compute.manager [req-35f8ef08-57c0-4e85-8dd9-09de8983531e req-8124e07b-6422-4c5e-b484-cbc0899df713 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Received event network-vif-plugged-c498aef4-51c1-44b1-973a-67630fa942ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:39:55 compute-0 nova_compute[186662]: 2026-02-19 19:39:55.257 186666 DEBUG oslo_concurrency.lockutils [req-35f8ef08-57c0-4e85-8dd9-09de8983531e req-8124e07b-6422-4c5e-b484-cbc0899df713 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:39:55 compute-0 nova_compute[186662]: 2026-02-19 19:39:55.257 186666 DEBUG oslo_concurrency.lockutils [req-35f8ef08-57c0-4e85-8dd9-09de8983531e req-8124e07b-6422-4c5e-b484-cbc0899df713 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:39:55 compute-0 nova_compute[186662]: 2026-02-19 19:39:55.257 186666 DEBUG oslo_concurrency.lockutils [req-35f8ef08-57c0-4e85-8dd9-09de8983531e req-8124e07b-6422-4c5e-b484-cbc0899df713 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:39:55 compute-0 nova_compute[186662]: 2026-02-19 19:39:55.257 186666 DEBUG nova.compute.manager [req-35f8ef08-57c0-4e85-8dd9-09de8983531e req-8124e07b-6422-4c5e-b484-cbc0899df713 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] No waiting events found dispatching network-vif-plugged-c498aef4-51c1-44b1-973a-67630fa942ba pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:39:55 compute-0 nova_compute[186662]: 2026-02-19 19:39:55.257 186666 WARNING nova.compute.manager [req-35f8ef08-57c0-4e85-8dd9-09de8983531e req-8124e07b-6422-4c5e-b484-cbc0899df713 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Received unexpected event network-vif-plugged-c498aef4-51c1-44b1-973a-67630fa942ba for instance with vm_state active and task_state None.
Feb 19 19:39:55 compute-0 podman[214080]: 2026-02-19 19:39:55.307763257 +0000 UTC m=+0.077308109 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Feb 19 19:39:55 compute-0 nova_compute[186662]: 2026-02-19 19:39:55.403 186666 INFO nova.compute.manager [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Took 12.69 seconds to build instance.
Feb 19 19:39:55 compute-0 nova_compute[186662]: 2026-02-19 19:39:55.435 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:55 compute-0 nova_compute[186662]: 2026-02-19 19:39:55.908 186666 DEBUG oslo_concurrency.lockutils [None req-86f4f393-02fd-4fe7-b1ca-895478d88708 08866af24c3a4551a81c8099dc9049fb 673c40a9f52d4914b8ae3fc458b05edf - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.214s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:39:59 compute-0 podman[214106]: 2026-02-19 19:39:59.316528097 +0000 UTC m=+0.092672412 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:39:59 compute-0 nova_compute[186662]: 2026-02-19 19:39:59.359 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:39:59 compute-0 podman[196025]: time="2026-02-19T19:39:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:39:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:39:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17220 "" "Go-http-client/1.1"
Feb 19 19:39:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:39:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2655 "" "Go-http-client/1.1"
Feb 19 19:40:00 compute-0 nova_compute[186662]: 2026-02-19 19:40:00.492 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:01 compute-0 openstack_network_exporter[198916]: ERROR   19:40:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:40:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:40:01 compute-0 openstack_network_exporter[198916]: ERROR   19:40:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:40:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:40:04 compute-0 nova_compute[186662]: 2026-02-19 19:40:04.360 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:05 compute-0 nova_compute[186662]: 2026-02-19 19:40:05.495 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:05 compute-0 ovn_controller[96653]: 2026-02-19T19:40:05Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ac:d4:5c 10.100.0.9
Feb 19 19:40:05 compute-0 ovn_controller[96653]: 2026-02-19T19:40:05Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ac:d4:5c 10.100.0.9
Feb 19 19:40:06 compute-0 nova_compute[186662]: 2026-02-19 19:40:06.535 186666 DEBUG nova.virt.libvirt.driver [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Check if temp file /var/lib/nova/instances/tmp4p6_oe_3 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Feb 19 19:40:06 compute-0 nova_compute[186662]: 2026-02-19 19:40:06.539 186666 DEBUG nova.compute.manager [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4p6_oe_3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bf5ada51-bedd-4e83-8e0c-414b4560ffa9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Feb 19 19:40:09 compute-0 nova_compute[186662]: 2026-02-19 19:40:09.362 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:10 compute-0 nova_compute[186662]: 2026-02-19 19:40:10.497 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:11 compute-0 nova_compute[186662]: 2026-02-19 19:40:11.029 186666 DEBUG oslo_concurrency.processutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:40:11 compute-0 nova_compute[186662]: 2026-02-19 19:40:11.094 186666 DEBUG oslo_concurrency.processutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:40:11 compute-0 nova_compute[186662]: 2026-02-19 19:40:11.096 186666 DEBUG oslo_concurrency.processutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:40:11 compute-0 nova_compute[186662]: 2026-02-19 19:40:11.161 186666 DEBUG oslo_concurrency.processutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:40:11 compute-0 nova_compute[186662]: 2026-02-19 19:40:11.162 186666 DEBUG nova.compute.manager [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Preparing to wait for external event network-vif-plugged-c498aef4-51c1-44b1-973a-67630fa942ba prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Feb 19 19:40:11 compute-0 nova_compute[186662]: 2026-02-19 19:40:11.162 186666 DEBUG oslo_concurrency.lockutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:40:11 compute-0 nova_compute[186662]: 2026-02-19 19:40:11.163 186666 DEBUG oslo_concurrency.lockutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:40:11 compute-0 nova_compute[186662]: 2026-02-19 19:40:11.163 186666 DEBUG oslo_concurrency.lockutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:40:14 compute-0 nova_compute[186662]: 2026-02-19 19:40:14.365 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:15 compute-0 nova_compute[186662]: 2026-02-19 19:40:15.500 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:16 compute-0 nova_compute[186662]: 2026-02-19 19:40:16.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:40:17 compute-0 nova_compute[186662]: 2026-02-19 19:40:17.084 186666 DEBUG nova.compute.manager [req-3de53d2a-0e36-4b4c-8b59-3bdd0b79d0b6 req-b47af9bd-5196-43a9-8a75-13e3b7bdfe41 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Received event network-vif-unplugged-c498aef4-51c1-44b1-973a-67630fa942ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:40:17 compute-0 nova_compute[186662]: 2026-02-19 19:40:17.084 186666 DEBUG oslo_concurrency.lockutils [req-3de53d2a-0e36-4b4c-8b59-3bdd0b79d0b6 req-b47af9bd-5196-43a9-8a75-13e3b7bdfe41 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:40:17 compute-0 nova_compute[186662]: 2026-02-19 19:40:17.084 186666 DEBUG oslo_concurrency.lockutils [req-3de53d2a-0e36-4b4c-8b59-3bdd0b79d0b6 req-b47af9bd-5196-43a9-8a75-13e3b7bdfe41 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:40:17 compute-0 nova_compute[186662]: 2026-02-19 19:40:17.085 186666 DEBUG oslo_concurrency.lockutils [req-3de53d2a-0e36-4b4c-8b59-3bdd0b79d0b6 req-b47af9bd-5196-43a9-8a75-13e3b7bdfe41 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:40:17 compute-0 nova_compute[186662]: 2026-02-19 19:40:17.085 186666 DEBUG nova.compute.manager [req-3de53d2a-0e36-4b4c-8b59-3bdd0b79d0b6 req-b47af9bd-5196-43a9-8a75-13e3b7bdfe41 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] No event matching network-vif-unplugged-c498aef4-51c1-44b1-973a-67630fa942ba in dict_keys([('network-vif-plugged', 'c498aef4-51c1-44b1-973a-67630fa942ba')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Feb 19 19:40:17 compute-0 nova_compute[186662]: 2026-02-19 19:40:17.085 186666 DEBUG nova.compute.manager [req-3de53d2a-0e36-4b4c-8b59-3bdd0b79d0b6 req-b47af9bd-5196-43a9-8a75-13e3b7bdfe41 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Received event network-vif-unplugged-c498aef4-51c1-44b1-973a-67630fa942ba for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:40:18 compute-0 nova_compute[186662]: 2026-02-19 19:40:18.085 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:40:18 compute-0 nova_compute[186662]: 2026-02-19 19:40:18.683 186666 INFO nova.compute.manager [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Took 7.52 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 19 19:40:19 compute-0 nova_compute[186662]: 2026-02-19 19:40:19.152 186666 DEBUG nova.compute.manager [req-6b2c5d08-091d-404c-919b-0ed269be6c1c req-7441b687-dd1f-4a10-97a1-c93981f796a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Received event network-vif-plugged-c498aef4-51c1-44b1-973a-67630fa942ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:40:19 compute-0 nova_compute[186662]: 2026-02-19 19:40:19.153 186666 DEBUG oslo_concurrency.lockutils [req-6b2c5d08-091d-404c-919b-0ed269be6c1c req-7441b687-dd1f-4a10-97a1-c93981f796a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:40:19 compute-0 nova_compute[186662]: 2026-02-19 19:40:19.154 186666 DEBUG oslo_concurrency.lockutils [req-6b2c5d08-091d-404c-919b-0ed269be6c1c req-7441b687-dd1f-4a10-97a1-c93981f796a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:40:19 compute-0 nova_compute[186662]: 2026-02-19 19:40:19.154 186666 DEBUG oslo_concurrency.lockutils [req-6b2c5d08-091d-404c-919b-0ed269be6c1c req-7441b687-dd1f-4a10-97a1-c93981f796a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:40:19 compute-0 nova_compute[186662]: 2026-02-19 19:40:19.154 186666 DEBUG nova.compute.manager [req-6b2c5d08-091d-404c-919b-0ed269be6c1c req-7441b687-dd1f-4a10-97a1-c93981f796a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Processing event network-vif-plugged-c498aef4-51c1-44b1-973a-67630fa942ba _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Feb 19 19:40:19 compute-0 nova_compute[186662]: 2026-02-19 19:40:19.155 186666 DEBUG nova.compute.manager [req-6b2c5d08-091d-404c-919b-0ed269be6c1c req-7441b687-dd1f-4a10-97a1-c93981f796a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Received event network-changed-c498aef4-51c1-44b1-973a-67630fa942ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:40:19 compute-0 nova_compute[186662]: 2026-02-19 19:40:19.156 186666 DEBUG nova.compute.manager [req-6b2c5d08-091d-404c-919b-0ed269be6c1c req-7441b687-dd1f-4a10-97a1-c93981f796a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Refreshing instance network info cache due to event network-changed-c498aef4-51c1-44b1-973a-67630fa942ba. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Feb 19 19:40:19 compute-0 nova_compute[186662]: 2026-02-19 19:40:19.156 186666 DEBUG oslo_concurrency.lockutils [req-6b2c5d08-091d-404c-919b-0ed269be6c1c req-7441b687-dd1f-4a10-97a1-c93981f796a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-bf5ada51-bedd-4e83-8e0c-414b4560ffa9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:40:19 compute-0 nova_compute[186662]: 2026-02-19 19:40:19.156 186666 DEBUG oslo_concurrency.lockutils [req-6b2c5d08-091d-404c-919b-0ed269be6c1c req-7441b687-dd1f-4a10-97a1-c93981f796a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-bf5ada51-bedd-4e83-8e0c-414b4560ffa9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:40:19 compute-0 nova_compute[186662]: 2026-02-19 19:40:19.157 186666 DEBUG nova.network.neutron [req-6b2c5d08-091d-404c-919b-0ed269be6c1c req-7441b687-dd1f-4a10-97a1-c93981f796a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Refreshing network info cache for port c498aef4-51c1-44b1-973a-67630fa942ba _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Feb 19 19:40:19 compute-0 nova_compute[186662]: 2026-02-19 19:40:19.159 186666 DEBUG nova.compute.manager [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Feb 19 19:40:19 compute-0 podman[214156]: 2026-02-19 19:40:19.283246769 +0000 UTC m=+0.055736914 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 19 19:40:19 compute-0 nova_compute[186662]: 2026-02-19 19:40:19.403 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:19 compute-0 nova_compute[186662]: 2026-02-19 19:40:19.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:40:19 compute-0 nova_compute[186662]: 2026-02-19 19:40:19.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:40:19 compute-0 nova_compute[186662]: 2026-02-19 19:40:19.665 186666 WARNING neutronclient.v2_0.client [req-6b2c5d08-091d-404c-919b-0ed269be6c1c req-7441b687-dd1f-4a10-97a1-c93981f796a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:40:19 compute-0 nova_compute[186662]: 2026-02-19 19:40:19.668 186666 DEBUG nova.compute.manager [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4p6_oe_3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bf5ada51-bedd-4e83-8e0c-414b4560ffa9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(accac169-e1fe-470e-803b-c697abd346dd),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Feb 19 19:40:20 compute-0 nova_compute[186662]: 2026-02-19 19:40:20.128 186666 WARNING neutronclient.v2_0.client [req-6b2c5d08-091d-404c-919b-0ed269be6c1c req-7441b687-dd1f-4a10-97a1-c93981f796a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:40:20 compute-0 nova_compute[186662]: 2026-02-19 19:40:20.181 186666 DEBUG nova.objects.instance [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'migration_context' on Instance uuid bf5ada51-bedd-4e83-8e0c-414b4560ffa9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:40:20 compute-0 nova_compute[186662]: 2026-02-19 19:40:20.183 186666 DEBUG nova.virt.libvirt.driver [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Feb 19 19:40:20 compute-0 nova_compute[186662]: 2026-02-19 19:40:20.184 186666 DEBUG nova.virt.libvirt.driver [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Feb 19 19:40:20 compute-0 nova_compute[186662]: 2026-02-19 19:40:20.184 186666 DEBUG nova.virt.libvirt.driver [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Feb 19 19:40:20 compute-0 nova_compute[186662]: 2026-02-19 19:40:20.315 186666 DEBUG nova.network.neutron [req-6b2c5d08-091d-404c-919b-0ed269be6c1c req-7441b687-dd1f-4a10-97a1-c93981f796a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Updated VIF entry in instance network info cache for port c498aef4-51c1-44b1-973a-67630fa942ba. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Feb 19 19:40:20 compute-0 nova_compute[186662]: 2026-02-19 19:40:20.316 186666 DEBUG nova.network.neutron [req-6b2c5d08-091d-404c-919b-0ed269be6c1c req-7441b687-dd1f-4a10-97a1-c93981f796a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Updating instance_info_cache with network_info: [{"id": "c498aef4-51c1-44b1-973a-67630fa942ba", "address": "fa:16:3e:ac:d4:5c", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc498aef4-51", "ovs_interfaceid": "c498aef4-51c1-44b1-973a-67630fa942ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:40:20 compute-0 nova_compute[186662]: 2026-02-19 19:40:20.538 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:20 compute-0 nova_compute[186662]: 2026-02-19 19:40:20.686 186666 DEBUG nova.virt.libvirt.driver [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Feb 19 19:40:20 compute-0 nova_compute[186662]: 2026-02-19 19:40:20.686 186666 DEBUG nova.virt.libvirt.driver [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Feb 19 19:40:20 compute-0 nova_compute[186662]: 2026-02-19 19:40:20.693 186666 DEBUG nova.virt.libvirt.vif [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:39:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-545137208',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-545',id=19,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:39:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='673c40a9f52d4914b8ae3fc458b05edf',ramdisk_id='',reservation_id='r-v40wvjoi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',
image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:39:54Z,user_data=None,user_id='08866af24c3a4551a81c8099dc9049fb',uuid=bf5ada51-bedd-4e83-8e0c-414b4560ffa9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c498aef4-51c1-44b1-973a-67630fa942ba", "address": "fa:16:3e:ac:d4:5c", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc498aef4-51", "ovs_interfaceid": "c498aef4-51c1-44b1-973a-67630fa942ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Feb 19 19:40:20 compute-0 nova_compute[186662]: 2026-02-19 19:40:20.694 186666 DEBUG nova.network.os_vif_util [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "c498aef4-51c1-44b1-973a-67630fa942ba", "address": "fa:16:3e:ac:d4:5c", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc498aef4-51", "ovs_interfaceid": "c498aef4-51c1-44b1-973a-67630fa942ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:40:20 compute-0 nova_compute[186662]: 2026-02-19 19:40:20.694 186666 DEBUG nova.network.os_vif_util [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:d4:5c,bridge_name='br-int',has_traffic_filtering=True,id=c498aef4-51c1-44b1-973a-67630fa942ba,network=Network(a8815e8f-3945-4eb5-98d8-e31bbb7c874a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc498aef4-51') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:40:20 compute-0 nova_compute[186662]: 2026-02-19 19:40:20.695 186666 DEBUG nova.virt.libvirt.migration [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Updating guest XML with vif config: <interface type="ethernet">
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <mac address="fa:16:3e:ac:d4:5c"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <model type="virtio"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <mtu size="1442"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <target dev="tapc498aef4-51"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]: </interface>
Feb 19 19:40:20 compute-0 nova_compute[186662]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Feb 19 19:40:20 compute-0 nova_compute[186662]: 2026-02-19 19:40:20.695 186666 DEBUG nova.virt.libvirt.migration [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <name>instance-00000013</name>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <uuid>bf5ada51-bedd-4e83-8e0c-414b4560ffa9</uuid>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-545137208</nova:name>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:39:49</nova:creationTime>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:40:20 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:40:20 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:user uuid="08866af24c3a4551a81c8099dc9049fb">tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347-project-admin</nova:user>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:project uuid="673c40a9f52d4914b8ae3fc458b05edf">tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347</nova:project>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:port uuid="c498aef4-51c1-44b1-973a-67630fa942ba">
Feb 19 19:40:20 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <memory unit="KiB">131072</memory>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <currentMemory unit="KiB">131072</currentMemory>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <vcpu placement="static">1</vcpu>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <resource>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <partition>/machine</partition>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </resource>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <system>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <entry name="serial">bf5ada51-bedd-4e83-8e0c-414b4560ffa9</entry>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <entry name="uuid">bf5ada51-bedd-4e83-8e0c-414b4560ffa9</entry>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </system>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <os>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </os>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <features>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <vmcoreinfo state="on"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </features>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact" check="partial">
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <model fallback="allow">Nehalem</model>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <on_poweroff>destroy</on_poweroff>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <on_reboot>restart</on_reboot>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <on_crash>destroy</on_crash>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk.config"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <readonly/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="0" model="pcie-root"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="1" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="1" port="0x10"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="2" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="2" port="0x11"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="3" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="3" port="0x12"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="4" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="4" port="0x13"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="5" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="5" port="0x14"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="6" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="6" port="0x15"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="7" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="7" port="0x16"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="8" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="8" port="0x17"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="9" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="9" port="0x18"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="10" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="10" port="0x19"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="11" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="11" port="0x1a"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="12" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="12" port="0x1b"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="13" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="13" port="0x1c"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="14" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="14" port="0x1d"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="15" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="15" port="0x1e"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="16" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="16" port="0x1f"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="17" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="17" port="0x20"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="18" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="18" port="0x21"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="19" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="19" port="0x22"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="20" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="20" port="0x23"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="21" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="21" port="0x24"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="22" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="22" port="0x25"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="23" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="23" port="0x26"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="24" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="24" port="0x27"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="25" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="25" port="0x28"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-pci-bridge"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="usb" index="0" model="piix3-uhci">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="sata" index="0">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <interface type="ethernet"><mac address="fa:16:3e:ac:d4:5c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc498aef4-51"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </interface><serial type="pty">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/console.log" append="off"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target type="isa-serial" port="0">
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <model name="isa-serial"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       </target>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <console type="pty">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/console.log" append="off"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target type="serial" port="0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </console>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="usb" bus="0" port="1"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </input>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <input type="mouse" bus="ps2"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <listen type="address" address="::"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </graphics>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <video>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model type="virtio" heads="1" primary="yes"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </video>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]: </domain>
Feb 19 19:40:20 compute-0 nova_compute[186662]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Feb 19 19:40:20 compute-0 nova_compute[186662]: 2026-02-19 19:40:20.697 186666 DEBUG nova.virt.libvirt.migration [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <name>instance-00000013</name>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <uuid>bf5ada51-bedd-4e83-8e0c-414b4560ffa9</uuid>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-545137208</nova:name>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:39:49</nova:creationTime>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:40:20 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:40:20 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:user uuid="08866af24c3a4551a81c8099dc9049fb">tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347-project-admin</nova:user>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:project uuid="673c40a9f52d4914b8ae3fc458b05edf">tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347</nova:project>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:port uuid="c498aef4-51c1-44b1-973a-67630fa942ba">
Feb 19 19:40:20 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <memory unit="KiB">131072</memory>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <currentMemory unit="KiB">131072</currentMemory>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <vcpu placement="static">1</vcpu>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <resource>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <partition>/machine</partition>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </resource>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <system>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <entry name="serial">bf5ada51-bedd-4e83-8e0c-414b4560ffa9</entry>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <entry name="uuid">bf5ada51-bedd-4e83-8e0c-414b4560ffa9</entry>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </system>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <os>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </os>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <features>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <vmcoreinfo state="on"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </features>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact" check="partial">
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <model fallback="allow">Nehalem</model>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <on_poweroff>destroy</on_poweroff>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <on_reboot>restart</on_reboot>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <on_crash>destroy</on_crash>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk.config"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <readonly/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="0" model="pcie-root"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="1" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="1" port="0x10"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="2" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="2" port="0x11"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="3" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="3" port="0x12"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="4" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="4" port="0x13"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="5" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="5" port="0x14"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="6" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="6" port="0x15"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="7" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="7" port="0x16"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="8" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="8" port="0x17"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="9" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="9" port="0x18"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="10" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="10" port="0x19"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="11" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="11" port="0x1a"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="12" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="12" port="0x1b"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="13" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="13" port="0x1c"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="14" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="14" port="0x1d"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="15" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="15" port="0x1e"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="16" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="16" port="0x1f"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="17" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="17" port="0x20"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="18" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="18" port="0x21"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="19" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="19" port="0x22"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="20" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="20" port="0x23"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="21" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="21" port="0x24"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="22" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="22" port="0x25"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="23" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="23" port="0x26"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="24" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="24" port="0x27"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="25" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="25" port="0x28"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-pci-bridge"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="usb" index="0" model="piix3-uhci">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="sata" index="0">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <interface type="ethernet"><mac address="fa:16:3e:ac:d4:5c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc498aef4-51"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </interface><serial type="pty">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/console.log" append="off"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target type="isa-serial" port="0">
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <model name="isa-serial"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       </target>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <console type="pty">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/console.log" append="off"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target type="serial" port="0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </console>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="usb" bus="0" port="1"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </input>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <input type="mouse" bus="ps2"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <listen type="address" address="::"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </graphics>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <video>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model type="virtio" heads="1" primary="yes"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </video>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]: </domain>
Feb 19 19:40:20 compute-0 nova_compute[186662]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Feb 19 19:40:20 compute-0 nova_compute[186662]: 2026-02-19 19:40:20.697 186666 DEBUG nova.virt.libvirt.migration [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] _update_pci_xml output xml=<domain type="kvm">
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <name>instance-00000013</name>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <uuid>bf5ada51-bedd-4e83-8e0c-414b4560ffa9</uuid>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-545137208</nova:name>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:39:49</nova:creationTime>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:40:20 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:40:20 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:user uuid="08866af24c3a4551a81c8099dc9049fb">tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347-project-admin</nova:user>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:project uuid="673c40a9f52d4914b8ae3fc458b05edf">tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347</nova:project>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <nova:port uuid="c498aef4-51c1-44b1-973a-67630fa942ba">
Feb 19 19:40:20 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <memory unit="KiB">131072</memory>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <currentMemory unit="KiB">131072</currentMemory>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <vcpu placement="static">1</vcpu>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <resource>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <partition>/machine</partition>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </resource>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <system>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <entry name="serial">bf5ada51-bedd-4e83-8e0c-414b4560ffa9</entry>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <entry name="uuid">bf5ada51-bedd-4e83-8e0c-414b4560ffa9</entry>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </system>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <os>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </os>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <features>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <vmcoreinfo state="on"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </features>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact" check="partial">
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <model fallback="allow">Nehalem</model>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <on_poweroff>destroy</on_poweroff>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <on_reboot>restart</on_reboot>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <on_crash>destroy</on_crash>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/disk.config"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <readonly/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="0" model="pcie-root"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="1" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="1" port="0x10"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="2" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="2" port="0x11"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="3" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="3" port="0x12"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="4" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="4" port="0x13"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="5" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="5" port="0x14"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="6" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="6" port="0x15"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="7" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="7" port="0x16"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="8" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="8" port="0x17"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="9" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="9" port="0x18"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="10" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="10" port="0x19"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="11" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="11" port="0x1a"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="12" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="12" port="0x1b"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="13" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="13" port="0x1c"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="14" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="14" port="0x1d"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="15" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="15" port="0x1e"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="16" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="16" port="0x1f"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="17" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="17" port="0x20"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="18" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="18" port="0x21"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="19" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="19" port="0x22"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="20" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="20" port="0x23"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="21" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="21" port="0x24"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="22" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="22" port="0x25"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="23" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="23" port="0x26"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="24" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="24" port="0x27"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="25" model="pcie-root-port">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target chassis="25" port="0x28"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model name="pcie-pci-bridge"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="usb" index="0" model="piix3-uhci">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <controller type="sata" index="0">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <interface type="ethernet">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <mac address="fa:16:3e:ac:d4:5c"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <mtu size="1442"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target dev="tapc498aef4-51"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <serial type="pty">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/console.log" append="off"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target type="isa-serial" port="0">
Feb 19 19:40:20 compute-0 nova_compute[186662]:         <model name="isa-serial"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       </target>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <console type="pty">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9/console.log" append="off"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <target type="serial" port="0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </console>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="usb" bus="0" port="1"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </input>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <input type="mouse" bus="ps2"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <listen type="address" address="::"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </graphics>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <video>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <model type="virtio" heads="1" primary="yes"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </video>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:40:20 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:40:20 compute-0 nova_compute[186662]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Feb 19 19:40:20 compute-0 nova_compute[186662]: </domain>
Feb 19 19:40:20 compute-0 nova_compute[186662]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Feb 19 19:40:20 compute-0 nova_compute[186662]: 2026-02-19 19:40:20.698 186666 DEBUG nova.virt.libvirt.driver [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Feb 19 19:40:20 compute-0 nova_compute[186662]: 2026-02-19 19:40:20.862 186666 DEBUG oslo_concurrency.lockutils [req-6b2c5d08-091d-404c-919b-0ed269be6c1c req-7441b687-dd1f-4a10-97a1-c93981f796a2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-bf5ada51-bedd-4e83-8e0c-414b4560ffa9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:40:21 compute-0 nova_compute[186662]: 2026-02-19 19:40:21.189 186666 DEBUG nova.virt.libvirt.migration [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Feb 19 19:40:21 compute-0 nova_compute[186662]: 2026-02-19 19:40:21.190 186666 INFO nova.virt.libvirt.migration [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.212 186666 INFO nova.virt.libvirt.driver [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 19 19:40:22 compute-0 kernel: tapc498aef4-51 (unregistering): left promiscuous mode
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.244 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:22 compute-0 NetworkManager[56519]: <info>  [1771530022.2452] device (tapc498aef4-51): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.251 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:22 compute-0 ovn_controller[96653]: 2026-02-19T19:40:22Z|00156|binding|INFO|Releasing lport c498aef4-51c1-44b1-973a-67630fa942ba from this chassis (sb_readonly=0)
Feb 19 19:40:22 compute-0 ovn_controller[96653]: 2026-02-19T19:40:22Z|00157|binding|INFO|Setting lport c498aef4-51c1-44b1-973a-67630fa942ba down in Southbound
Feb 19 19:40:22 compute-0 ovn_controller[96653]: 2026-02-19T19:40:22Z|00158|binding|INFO|Removing iface tapc498aef4-51 ovn-installed in OVS
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.254 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.260 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:22.262 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:d4:5c 10.100.0.9'], port_security=['fa:16:3e:ac:d4:5c 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'd8481919-b10e-4218-b697-835a5c48ac63'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'bf5ada51-bedd-4e83-8e0c-414b4560ffa9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a8815e8f-3945-4eb5-98d8-e31bbb7c874a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '673c40a9f52d4914b8ae3fc458b05edf', 'neutron:revision_number': '10', 'neutron:security_group_ids': '21b17461-96ed-47ff-b7be-d33698019865', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ccf569e3-981f-424a-aec6-af69ea494142, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=c498aef4-51c1-44b1-973a-67630fa942ba) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:40:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:22.263 105986 INFO neutron.agent.ovn.metadata.agent [-] Port c498aef4-51c1-44b1-973a-67630fa942ba in datapath a8815e8f-3945-4eb5-98d8-e31bbb7c874a unbound from our chassis
Feb 19 19:40:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:22.264 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a8815e8f-3945-4eb5-98d8-e31bbb7c874a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:40:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:22.265 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[35a9ae74-36b1-4c54-837b-ae0794d7f1bc]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:40:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:22.265 105986 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a namespace which is not needed anymore
Feb 19 19:40:22 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000013.scope: Deactivated successfully.
Feb 19 19:40:22 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000013.scope: Consumed 12.576s CPU time.
Feb 19 19:40:22 compute-0 systemd-machined[156014]: Machine qemu-14-instance-00000013 terminated.
Feb 19 19:40:22 compute-0 neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a[214065]: [NOTICE]   (214069) : haproxy version is 3.0.5-8e879a5
Feb 19 19:40:22 compute-0 neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a[214065]: [NOTICE]   (214069) : path to executable is /usr/sbin/haproxy
Feb 19 19:40:22 compute-0 neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a[214065]: [WARNING]  (214069) : Exiting Master process...
Feb 19 19:40:22 compute-0 podman[214205]: 2026-02-19 19:40:22.36090819 +0000 UTC m=+0.026967897 container kill 805079d773ee951ef603bf20486f5620ab9f2624a2ad637c3b3a36a85939148c (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:40:22 compute-0 neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a[214065]: [ALERT]    (214069) : Current worker (214071) exited with code 143 (Terminated)
Feb 19 19:40:22 compute-0 neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a[214065]: [WARNING]  (214069) : All workers exited. Exiting... (0)
Feb 19 19:40:22 compute-0 systemd[1]: libpod-805079d773ee951ef603bf20486f5620ab9f2624a2ad637c3b3a36a85939148c.scope: Deactivated successfully.
Feb 19 19:40:22 compute-0 podman[214220]: 2026-02-19 19:40:22.40374669 +0000 UTC m=+0.027111489 container died 805079d773ee951ef603bf20486f5620ab9f2624a2ad637c3b3a36a85939148c (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:40:22 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-805079d773ee951ef603bf20486f5620ab9f2624a2ad637c3b3a36a85939148c-userdata-shm.mount: Deactivated successfully.
Feb 19 19:40:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-3240631c70e28811cc12ab750b7a362e9893789a8787ca133aba4d12c88dd2a0-merged.mount: Deactivated successfully.
Feb 19 19:40:22 compute-0 podman[214220]: 2026-02-19 19:40:22.442936952 +0000 UTC m=+0.066301751 container cleanup 805079d773ee951ef603bf20486f5620ab9f2624a2ad637c3b3a36a85939148c (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 19 19:40:22 compute-0 systemd[1]: libpod-conmon-805079d773ee951ef603bf20486f5620ab9f2624a2ad637c3b3a36a85939148c.scope: Deactivated successfully.
Feb 19 19:40:22 compute-0 podman[214222]: 2026-02-19 19:40:22.458167882 +0000 UTC m=+0.076740365 container remove 805079d773ee951ef603bf20486f5620ab9f2624a2ad637c3b3a36a85939148c (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 19 19:40:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:22.463 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[7cb9ae9a-940d-4e41-ae43-50923424933d]: (4, ("Thu Feb 19 07:40:22 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a (805079d773ee951ef603bf20486f5620ab9f2624a2ad637c3b3a36a85939148c)\n805079d773ee951ef603bf20486f5620ab9f2624a2ad637c3b3a36a85939148c\nThu Feb 19 07:40:22 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a (805079d773ee951ef603bf20486f5620ab9f2624a2ad637c3b3a36a85939148c)\n805079d773ee951ef603bf20486f5620ab9f2624a2ad637c3b3a36a85939148c\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:40:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:22.464 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[7e1674d6-44ac-4cfc-8a5d-56096a797670]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:40:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:22.465 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a8815e8f-3945-4eb5-98d8-e31bbb7c874a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:40:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:22.465 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[c6c1851e-e531-49db-8b03-e83cd7540da9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:40:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:22.466 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa8815e8f-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.467 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:22 compute-0 kernel: tapa8815e8f-30: left promiscuous mode
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.475 186666 DEBUG nova.virt.libvirt.driver [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.475 186666 DEBUG nova.virt.libvirt.driver [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.475 186666 DEBUG nova.virt.libvirt.driver [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.476 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.477 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:22.479 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[c7d309fb-1b3e-4318-a943-dc016263dde6]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:40:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:22.493 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[47554629-b423-4a8a-9ed3-ddf241c7e674]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:40:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:22.493 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[8f052db0-c4fe-41d3-9f21-5472ed4e2a8e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:40:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:22.503 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[1c6e3d8c-46f2-442b-9ff7-caca6f9f1877]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426264, 'reachable_time': 15570, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214274, 'error': None, 'target': 'ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:40:22 compute-0 systemd[1]: run-netns-ovnmeta\x2da8815e8f\x2d3945\x2d4eb5\x2d98d8\x2de31bbb7c874a.mount: Deactivated successfully.
Feb 19 19:40:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:22.505 106358 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a8815e8f-3945-4eb5-98d8-e31bbb7c874a deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Feb 19 19:40:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:22.505 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[b66b076d-0421-4e43-b91f-7108dc5817e0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.715 186666 DEBUG nova.virt.libvirt.guest [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'bf5ada51-bedd-4e83-8e0c-414b4560ffa9' (instance-00000013) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.715 186666 INFO nova.virt.libvirt.driver [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Migration operation has completed
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.716 186666 INFO nova.compute.manager [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] _post_live_migration() is started..
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.734 186666 WARNING neutronclient.v2_0.client [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.734 186666 WARNING neutronclient.v2_0.client [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.949 186666 DEBUG nova.compute.manager [req-d010ebc7-89a3-4035-a706-9dc425fd91a5 req-6d24668e-444b-4484-bb97-e89eb55031b1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Received event network-vif-unplugged-c498aef4-51c1-44b1-973a-67630fa942ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.950 186666 DEBUG oslo_concurrency.lockutils [req-d010ebc7-89a3-4035-a706-9dc425fd91a5 req-6d24668e-444b-4484-bb97-e89eb55031b1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.950 186666 DEBUG oslo_concurrency.lockutils [req-d010ebc7-89a3-4035-a706-9dc425fd91a5 req-6d24668e-444b-4484-bb97-e89eb55031b1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.951 186666 DEBUG oslo_concurrency.lockutils [req-d010ebc7-89a3-4035-a706-9dc425fd91a5 req-6d24668e-444b-4484-bb97-e89eb55031b1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.951 186666 DEBUG nova.compute.manager [req-d010ebc7-89a3-4035-a706-9dc425fd91a5 req-6d24668e-444b-4484-bb97-e89eb55031b1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] No waiting events found dispatching network-vif-unplugged-c498aef4-51c1-44b1-973a-67630fa942ba pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:40:22 compute-0 nova_compute[186662]: 2026-02-19 19:40:22.952 186666 DEBUG nova.compute.manager [req-d010ebc7-89a3-4035-a706-9dc425fd91a5 req-6d24668e-444b-4484-bb97-e89eb55031b1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Received event network-vif-unplugged-c498aef4-51c1-44b1-973a-67630fa942ba for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.281 186666 DEBUG nova.network.neutron [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Activated binding for port c498aef4-51c1-44b1-973a-67630fa942ba and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.281 186666 DEBUG nova.compute.manager [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "c498aef4-51c1-44b1-973a-67630fa942ba", "address": "fa:16:3e:ac:d4:5c", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc498aef4-51", "ovs_interfaceid": "c498aef4-51c1-44b1-973a-67630fa942ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.282 186666 DEBUG nova.virt.libvirt.vif [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:39:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-545137208',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-545',id=19,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:39:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='673c40a9f52d4914b8ae3fc458b05edf',ramdisk_id='',reservation_id='r-v40wvjoi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-2075983347-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:40:02Z,user_data=None,user_id='08866af24c3a4551a81c8099dc9049fb',uuid=bf5ada51-bedd-4e83-8e0c-414b4560ffa9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c498aef4-51c1-44b1-973a-67630fa942ba", "address": "fa:16:3e:ac:d4:5c", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc498aef4-51", "ovs_interfaceid": "c498aef4-51c1-44b1-973a-67630fa942ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.282 186666 DEBUG nova.network.os_vif_util [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "c498aef4-51c1-44b1-973a-67630fa942ba", "address": "fa:16:3e:ac:d4:5c", "network": {"id": "a8815e8f-3945-4eb5-98d8-e31bbb7c874a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-569331339-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a82810a912eb44f4bbe52dfbc8765740", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc498aef4-51", "ovs_interfaceid": "c498aef4-51c1-44b1-973a-67630fa942ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.283 186666 DEBUG nova.network.os_vif_util [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:d4:5c,bridge_name='br-int',has_traffic_filtering=True,id=c498aef4-51c1-44b1-973a-67630fa942ba,network=Network(a8815e8f-3945-4eb5-98d8-e31bbb7c874a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc498aef4-51') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.283 186666 DEBUG os_vif [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:d4:5c,bridge_name='br-int',has_traffic_filtering=True,id=c498aef4-51c1-44b1-973a-67630fa942ba,network=Network(a8815e8f-3945-4eb5-98d8-e31bbb7c874a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc498aef4-51') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.284 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.285 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc498aef4-51, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.286 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.287 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.288 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.288 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=b32a1ff8-10d8-4619-9d4f-861caa21ade8) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.289 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.291 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.293 186666 INFO os_vif [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:d4:5c,bridge_name='br-int',has_traffic_filtering=True,id=c498aef4-51c1-44b1-973a-67630fa942ba,network=Network(a8815e8f-3945-4eb5-98d8-e31bbb7c874a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc498aef4-51')
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.293 186666 DEBUG oslo_concurrency.lockutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.294 186666 DEBUG oslo_concurrency.lockutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.295 186666 DEBUG oslo_concurrency.lockutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.295 186666 DEBUG nova.compute.manager [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.296 186666 INFO nova.virt.libvirt.driver [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Deleting instance files /var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9_del
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.296 186666 INFO nova.virt.libvirt.driver [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Deletion of /var/lib/nova/instances/bf5ada51-bedd-4e83-8e0c-414b4560ffa9_del complete
Feb 19 19:40:23 compute-0 podman[214276]: 2026-02-19 19:40:23.313724338 +0000 UTC m=+0.087983309 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, version=9.7, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-type=git)
Feb 19 19:40:23 compute-0 nova_compute[186662]: 2026-02-19 19:40:23.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:40:24 compute-0 nova_compute[186662]: 2026-02-19 19:40:24.403 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:24 compute-0 nova_compute[186662]: 2026-02-19 19:40:24.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:40:25 compute-0 nova_compute[186662]: 2026-02-19 19:40:25.027 186666 DEBUG nova.compute.manager [req-45d65d75-150f-4630-afde-8fd0f394cdb2 req-75e6f097-98fb-41b6-b795-71dd542ec9bc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Received event network-vif-plugged-c498aef4-51c1-44b1-973a-67630fa942ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:40:25 compute-0 nova_compute[186662]: 2026-02-19 19:40:25.028 186666 DEBUG oslo_concurrency.lockutils [req-45d65d75-150f-4630-afde-8fd0f394cdb2 req-75e6f097-98fb-41b6-b795-71dd542ec9bc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:40:25 compute-0 nova_compute[186662]: 2026-02-19 19:40:25.028 186666 DEBUG oslo_concurrency.lockutils [req-45d65d75-150f-4630-afde-8fd0f394cdb2 req-75e6f097-98fb-41b6-b795-71dd542ec9bc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:40:25 compute-0 nova_compute[186662]: 2026-02-19 19:40:25.028 186666 DEBUG oslo_concurrency.lockutils [req-45d65d75-150f-4630-afde-8fd0f394cdb2 req-75e6f097-98fb-41b6-b795-71dd542ec9bc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:40:25 compute-0 nova_compute[186662]: 2026-02-19 19:40:25.028 186666 DEBUG nova.compute.manager [req-45d65d75-150f-4630-afde-8fd0f394cdb2 req-75e6f097-98fb-41b6-b795-71dd542ec9bc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] No waiting events found dispatching network-vif-plugged-c498aef4-51c1-44b1-973a-67630fa942ba pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:40:25 compute-0 nova_compute[186662]: 2026-02-19 19:40:25.029 186666 WARNING nova.compute.manager [req-45d65d75-150f-4630-afde-8fd0f394cdb2 req-75e6f097-98fb-41b6-b795-71dd542ec9bc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Received unexpected event network-vif-plugged-c498aef4-51c1-44b1-973a-67630fa942ba for instance with vm_state active and task_state migrating.
Feb 19 19:40:25 compute-0 nova_compute[186662]: 2026-02-19 19:40:25.030 186666 DEBUG nova.compute.manager [req-45d65d75-150f-4630-afde-8fd0f394cdb2 req-75e6f097-98fb-41b6-b795-71dd542ec9bc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Received event network-vif-unplugged-c498aef4-51c1-44b1-973a-67630fa942ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:40:25 compute-0 nova_compute[186662]: 2026-02-19 19:40:25.030 186666 DEBUG oslo_concurrency.lockutils [req-45d65d75-150f-4630-afde-8fd0f394cdb2 req-75e6f097-98fb-41b6-b795-71dd542ec9bc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:40:25 compute-0 nova_compute[186662]: 2026-02-19 19:40:25.030 186666 DEBUG oslo_concurrency.lockutils [req-45d65d75-150f-4630-afde-8fd0f394cdb2 req-75e6f097-98fb-41b6-b795-71dd542ec9bc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:40:25 compute-0 nova_compute[186662]: 2026-02-19 19:40:25.031 186666 DEBUG oslo_concurrency.lockutils [req-45d65d75-150f-4630-afde-8fd0f394cdb2 req-75e6f097-98fb-41b6-b795-71dd542ec9bc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:40:25 compute-0 nova_compute[186662]: 2026-02-19 19:40:25.031 186666 DEBUG nova.compute.manager [req-45d65d75-150f-4630-afde-8fd0f394cdb2 req-75e6f097-98fb-41b6-b795-71dd542ec9bc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] No waiting events found dispatching network-vif-unplugged-c498aef4-51c1-44b1-973a-67630fa942ba pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:40:25 compute-0 nova_compute[186662]: 2026-02-19 19:40:25.031 186666 DEBUG nova.compute.manager [req-45d65d75-150f-4630-afde-8fd0f394cdb2 req-75e6f097-98fb-41b6-b795-71dd542ec9bc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Received event network-vif-unplugged-c498aef4-51c1-44b1-973a-67630fa942ba for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:40:25 compute-0 nova_compute[186662]: 2026-02-19 19:40:25.031 186666 DEBUG nova.compute.manager [req-45d65d75-150f-4630-afde-8fd0f394cdb2 req-75e6f097-98fb-41b6-b795-71dd542ec9bc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Received event network-vif-plugged-c498aef4-51c1-44b1-973a-67630fa942ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:40:25 compute-0 nova_compute[186662]: 2026-02-19 19:40:25.031 186666 DEBUG oslo_concurrency.lockutils [req-45d65d75-150f-4630-afde-8fd0f394cdb2 req-75e6f097-98fb-41b6-b795-71dd542ec9bc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:40:25 compute-0 nova_compute[186662]: 2026-02-19 19:40:25.031 186666 DEBUG oslo_concurrency.lockutils [req-45d65d75-150f-4630-afde-8fd0f394cdb2 req-75e6f097-98fb-41b6-b795-71dd542ec9bc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:40:25 compute-0 nova_compute[186662]: 2026-02-19 19:40:25.031 186666 DEBUG oslo_concurrency.lockutils [req-45d65d75-150f-4630-afde-8fd0f394cdb2 req-75e6f097-98fb-41b6-b795-71dd542ec9bc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:40:25 compute-0 nova_compute[186662]: 2026-02-19 19:40:25.032 186666 DEBUG nova.compute.manager [req-45d65d75-150f-4630-afde-8fd0f394cdb2 req-75e6f097-98fb-41b6-b795-71dd542ec9bc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] No waiting events found dispatching network-vif-plugged-c498aef4-51c1-44b1-973a-67630fa942ba pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:40:25 compute-0 nova_compute[186662]: 2026-02-19 19:40:25.032 186666 WARNING nova.compute.manager [req-45d65d75-150f-4630-afde-8fd0f394cdb2 req-75e6f097-98fb-41b6-b795-71dd542ec9bc 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Received unexpected event network-vif-plugged-c498aef4-51c1-44b1-973a-67630fa942ba for instance with vm_state active and task_state migrating.
Feb 19 19:40:26 compute-0 podman[214300]: 2026-02-19 19:40:26.378413403 +0000 UTC m=+0.147658629 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Feb 19 19:40:26 compute-0 nova_compute[186662]: 2026-02-19 19:40:26.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:40:27 compute-0 nova_compute[186662]: 2026-02-19 19:40:27.088 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:40:27 compute-0 nova_compute[186662]: 2026-02-19 19:40:27.088 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:40:27 compute-0 nova_compute[186662]: 2026-02-19 19:40:27.088 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:40:27 compute-0 nova_compute[186662]: 2026-02-19 19:40:27.089 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:40:27 compute-0 nova_compute[186662]: 2026-02-19 19:40:27.230 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:40:27 compute-0 nova_compute[186662]: 2026-02-19 19:40:27.232 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:40:27 compute-0 nova_compute[186662]: 2026-02-19 19:40:27.254 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:40:27 compute-0 nova_compute[186662]: 2026-02-19 19:40:27.255 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5771MB free_disk=72.9760856628418GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:40:27 compute-0 nova_compute[186662]: 2026-02-19 19:40:27.255 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:40:27 compute-0 nova_compute[186662]: 2026-02-19 19:40:27.255 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:40:28 compute-0 nova_compute[186662]: 2026-02-19 19:40:28.271 186666 INFO nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Updating resource usage from migration accac169-e1fe-470e-803b-c697abd346dd
Feb 19 19:40:28 compute-0 nova_compute[186662]: 2026-02-19 19:40:28.291 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:28 compute-0 nova_compute[186662]: 2026-02-19 19:40:28.315 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Migration accac169-e1fe-470e-803b-c697abd346dd is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Feb 19 19:40:28 compute-0 nova_compute[186662]: 2026-02-19 19:40:28.315 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:40:28 compute-0 nova_compute[186662]: 2026-02-19 19:40:28.316 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:40:27 up  1:11,  0 user,  load average: 0.31, 0.31, 0.34\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_673c40a9f52d4914b8ae3fc458b05edf': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:40:28 compute-0 nova_compute[186662]: 2026-02-19 19:40:28.403 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:40:28 compute-0 nova_compute[186662]: 2026-02-19 19:40:28.912 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:40:29 compute-0 nova_compute[186662]: 2026-02-19 19:40:29.406 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:29 compute-0 nova_compute[186662]: 2026-02-19 19:40:29.422 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:40:29 compute-0 nova_compute[186662]: 2026-02-19 19:40:29.422 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.167s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:40:29 compute-0 nova_compute[186662]: 2026-02-19 19:40:29.423 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:40:29 compute-0 nova_compute[186662]: 2026-02-19 19:40:29.423 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Feb 19 19:40:29 compute-0 podman[196025]: time="2026-02-19T19:40:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:40:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:40:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:40:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:40:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2190 "" "Go-http-client/1.1"
Feb 19 19:40:29 compute-0 nova_compute[186662]: 2026-02-19 19:40:29.931 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Feb 19 19:40:29 compute-0 nova_compute[186662]: 2026-02-19 19:40:29.932 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:40:29 compute-0 nova_compute[186662]: 2026-02-19 19:40:29.932 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Feb 19 19:40:30 compute-0 podman[214328]: 2026-02-19 19:40:30.262701669 +0000 UTC m=+0.041590682 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:40:31 compute-0 openstack_network_exporter[198916]: ERROR   19:40:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:40:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:40:31 compute-0 openstack_network_exporter[198916]: ERROR   19:40:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:40:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:40:31 compute-0 nova_compute[186662]: 2026-02-19 19:40:31.444 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:40:31 compute-0 nova_compute[186662]: 2026-02-19 19:40:31.829 186666 DEBUG oslo_concurrency.lockutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:40:31 compute-0 nova_compute[186662]: 2026-02-19 19:40:31.830 186666 DEBUG oslo_concurrency.lockutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:40:31 compute-0 nova_compute[186662]: 2026-02-19 19:40:31.830 186666 DEBUG oslo_concurrency.lockutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "bf5ada51-bedd-4e83-8e0c-414b4560ffa9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:40:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:32.142 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:40:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:32.142 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:40:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:32.143 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:40:32 compute-0 nova_compute[186662]: 2026-02-19 19:40:32.343 186666 DEBUG oslo_concurrency.lockutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:40:32 compute-0 nova_compute[186662]: 2026-02-19 19:40:32.344 186666 DEBUG oslo_concurrency.lockutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:40:32 compute-0 nova_compute[186662]: 2026-02-19 19:40:32.344 186666 DEBUG oslo_concurrency.lockutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:40:32 compute-0 nova_compute[186662]: 2026-02-19 19:40:32.344 186666 DEBUG nova.compute.resource_tracker [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:40:32 compute-0 nova_compute[186662]: 2026-02-19 19:40:32.439 186666 WARNING nova.virt.libvirt.driver [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:40:32 compute-0 nova_compute[186662]: 2026-02-19 19:40:32.440 186666 DEBUG oslo_concurrency.processutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:40:32 compute-0 nova_compute[186662]: 2026-02-19 19:40:32.451 186666 DEBUG oslo_concurrency.processutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.011s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:40:32 compute-0 nova_compute[186662]: 2026-02-19 19:40:32.452 186666 DEBUG nova.compute.resource_tracker [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5770MB free_disk=72.9760856628418GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": 
"0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:40:32 compute-0 nova_compute[186662]: 2026-02-19 19:40:32.452 186666 DEBUG oslo_concurrency.lockutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:40:32 compute-0 nova_compute[186662]: 2026-02-19 19:40:32.453 186666 DEBUG oslo_concurrency.lockutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:40:33 compute-0 nova_compute[186662]: 2026-02-19 19:40:33.339 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:33 compute-0 nova_compute[186662]: 2026-02-19 19:40:33.467 186666 DEBUG nova.compute.resource_tracker [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Migration for instance bf5ada51-bedd-4e83-8e0c-414b4560ffa9 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Feb 19 19:40:33 compute-0 nova_compute[186662]: 2026-02-19 19:40:33.977 186666 DEBUG nova.compute.resource_tracker [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Feb 19 19:40:34 compute-0 nova_compute[186662]: 2026-02-19 19:40:34.001 186666 DEBUG nova.compute.resource_tracker [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Migration accac169-e1fe-470e-803b-c697abd346dd is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Feb 19 19:40:34 compute-0 nova_compute[186662]: 2026-02-19 19:40:34.002 186666 DEBUG nova.compute.resource_tracker [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:40:34 compute-0 nova_compute[186662]: 2026-02-19 19:40:34.002 186666 DEBUG nova.compute.resource_tracker [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:40:32 up  1:11,  0 user,  load average: 0.29, 0.31, 0.34\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:40:34 compute-0 nova_compute[186662]: 2026-02-19 19:40:34.031 186666 DEBUG nova.compute.provider_tree [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:40:34 compute-0 nova_compute[186662]: 2026-02-19 19:40:34.453 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:34 compute-0 nova_compute[186662]: 2026-02-19 19:40:34.540 186666 DEBUG nova.scheduler.client.report [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:40:35 compute-0 nova_compute[186662]: 2026-02-19 19:40:35.047 186666 DEBUG nova.compute.resource_tracker [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:40:35 compute-0 nova_compute[186662]: 2026-02-19 19:40:35.047 186666 DEBUG oslo_concurrency.lockutils [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.595s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:40:35 compute-0 nova_compute[186662]: 2026-02-19 19:40:35.066 186666 INFO nova.compute.manager [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 19 19:40:36 compute-0 nova_compute[186662]: 2026-02-19 19:40:36.127 186666 INFO nova.scheduler.client.report [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Deleted allocation for migration accac169-e1fe-470e-803b-c697abd346dd
Feb 19 19:40:36 compute-0 nova_compute[186662]: 2026-02-19 19:40:36.128 186666 DEBUG nova.virt.libvirt.driver [None req-0fce3a6d-0ef2-40dd-a65d-2a4ec6039427 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: bf5ada51-bedd-4e83-8e0c-414b4560ffa9] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Feb 19 19:40:38 compute-0 nova_compute[186662]: 2026-02-19 19:40:38.343 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:39 compute-0 nova_compute[186662]: 2026-02-19 19:40:39.453 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:43 compute-0 nova_compute[186662]: 2026-02-19 19:40:43.346 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:44 compute-0 nova_compute[186662]: 2026-02-19 19:40:44.455 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:48 compute-0 nova_compute[186662]: 2026-02-19 19:40:48.413 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:48 compute-0 sshd-session[214355]: Invalid user mysql from 45.169.200.254 port 34520
Feb 19 19:40:48 compute-0 sshd-session[214355]: Received disconnect from 45.169.200.254 port 34520:11: Bye Bye [preauth]
Feb 19 19:40:48 compute-0 sshd-session[214355]: Disconnected from invalid user mysql 45.169.200.254 port 34520 [preauth]
Feb 19 19:40:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:48.904 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:40:48 compute-0 nova_compute[186662]: 2026-02-19 19:40:48.904 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:48.904 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:40:49 compute-0 nova_compute[186662]: 2026-02-19 19:40:49.512 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:50 compute-0 podman[214358]: 2026-02-19 19:40:50.281612777 +0000 UTC m=+0.058827600 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.43.0)
Feb 19 19:40:51 compute-0 nova_compute[186662]: 2026-02-19 19:40:51.836 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:53 compute-0 nova_compute[186662]: 2026-02-19 19:40:53.456 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:54 compute-0 podman[214378]: 2026-02-19 19:40:54.270428963 +0000 UTC m=+0.050022417 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, version=9.7, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=)
Feb 19 19:40:54 compute-0 nova_compute[186662]: 2026-02-19 19:40:54.558 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:57 compute-0 podman[214400]: 2026-02-19 19:40:57.289243224 +0000 UTC m=+0.070774741 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 19 19:40:57 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:40:57.906 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:40:58 compute-0 nova_compute[186662]: 2026-02-19 19:40:58.458 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:59 compute-0 nova_compute[186662]: 2026-02-19 19:40:59.560 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:40:59 compute-0 podman[196025]: time="2026-02-19T19:40:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:40:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:40:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:40:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:40:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2195 "" "Go-http-client/1.1"
Feb 19 19:41:01 compute-0 podman[214426]: 2026-02-19 19:41:01.281555713 +0000 UTC m=+0.061127185 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:41:01 compute-0 openstack_network_exporter[198916]: ERROR   19:41:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:41:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:41:01 compute-0 openstack_network_exporter[198916]: ERROR   19:41:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:41:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:41:03 compute-0 nova_compute[186662]: 2026-02-19 19:41:03.460 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:04 compute-0 nova_compute[186662]: 2026-02-19 19:41:04.560 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:07 compute-0 sshd-session[214451]: Invalid user n8n from 197.211.55.20 port 33340
Feb 19 19:41:07 compute-0 sshd-session[214451]: Received disconnect from 197.211.55.20 port 33340:11: Bye Bye [preauth]
Feb 19 19:41:07 compute-0 sshd-session[214451]: Disconnected from invalid user n8n 197.211.55.20 port 33340 [preauth]
Feb 19 19:41:07 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:41:07.926 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:df:1e 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-68c034ae-92c4-4919-9175-8992eee924de', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-68c034ae-92c4-4919-9175-8992eee924de', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '72833255ccf64eb28a4dd1a5d46b8625', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db5091ff-cd1b-42d9-b277-948d5e6e25d3, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ff9c6a14-aac0-4622-8df4-fda6d4294002) old=Port_Binding(mac=['fa:16:3e:ae:df:1e'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-68c034ae-92c4-4919-9175-8992eee924de', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-68c034ae-92c4-4919-9175-8992eee924de', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '72833255ccf64eb28a4dd1a5d46b8625', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:41:07 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:41:07.927 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ff9c6a14-aac0-4622-8df4-fda6d4294002 in datapath 68c034ae-92c4-4919-9175-8992eee924de updated
Feb 19 19:41:07 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:41:07.927 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 68c034ae-92c4-4919-9175-8992eee924de, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:41:07 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:41:07.928 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[7f0dfc87-c3eb-4e9a-9d75-c4cfa876cef1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:41:08 compute-0 nova_compute[186662]: 2026-02-19 19:41:08.462 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:09 compute-0 nova_compute[186662]: 2026-02-19 19:41:09.566 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:13 compute-0 nova_compute[186662]: 2026-02-19 19:41:13.464 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:14 compute-0 nova_compute[186662]: 2026-02-19 19:41:14.620 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:15 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:41:15.431 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:9c:73 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-5d670c56-7787-4a13-b93a-c3379716cd11', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d670c56-7787-4a13-b93a-c3379716cd11', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5cd356f35cb9438f878370c6e951113c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=59b9aef7-0bdf-4211-b186-9739b7ced827, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=89384fd5-46d3-4482-ac0b-74bcb61a7dd8) old=Port_Binding(mac=['fa:16:3e:ab:9c:73'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-5d670c56-7787-4a13-b93a-c3379716cd11', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d670c56-7787-4a13-b93a-c3379716cd11', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5cd356f35cb9438f878370c6e951113c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:41:15 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:41:15.432 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 89384fd5-46d3-4482-ac0b-74bcb61a7dd8 in datapath 5d670c56-7787-4a13-b93a-c3379716cd11 updated
Feb 19 19:41:15 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:41:15.432 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5d670c56-7787-4a13-b93a-c3379716cd11, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:41:15 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:41:15.433 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[095d25c3-e164-4677-b7da-b832a257a7bd]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:41:16 compute-0 sshd-session[214453]: Invalid user git from 96.78.175.42 port 47398
Feb 19 19:41:16 compute-0 sshd-session[214453]: Received disconnect from 96.78.175.42 port 47398:11: Bye Bye [preauth]
Feb 19 19:41:16 compute-0 sshd-session[214453]: Disconnected from invalid user git 96.78.175.42 port 47398 [preauth]
Feb 19 19:41:18 compute-0 nova_compute[186662]: 2026-02-19 19:41:18.512 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:19 compute-0 nova_compute[186662]: 2026-02-19 19:41:19.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:41:19 compute-0 nova_compute[186662]: 2026-02-19 19:41:19.665 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:20 compute-0 nova_compute[186662]: 2026-02-19 19:41:20.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:41:21 compute-0 podman[214455]: 2026-02-19 19:41:21.288456187 +0000 UTC m=+0.068240305 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:41:21 compute-0 nova_compute[186662]: 2026-02-19 19:41:21.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:41:23 compute-0 nova_compute[186662]: 2026-02-19 19:41:23.556 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:24 compute-0 nova_compute[186662]: 2026-02-19 19:41:24.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:41:24 compute-0 nova_compute[186662]: 2026-02-19 19:41:24.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:41:24 compute-0 nova_compute[186662]: 2026-02-19 19:41:24.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:41:24 compute-0 nova_compute[186662]: 2026-02-19 19:41:24.712 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:25 compute-0 podman[214474]: 2026-02-19 19:41:25.257534839 +0000 UTC m=+0.038789535 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, vcs-type=git, distribution-scope=public, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9/ubi-minimal)
Feb 19 19:41:25 compute-0 nova_compute[186662]: 2026-02-19 19:41:25.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:41:26 compute-0 ovn_controller[96653]: 2026-02-19T19:41:26Z|00159|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Feb 19 19:41:26 compute-0 nova_compute[186662]: 2026-02-19 19:41:26.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:41:28 compute-0 podman[214495]: 2026-02-19 19:41:28.282919742 +0000 UTC m=+0.064614399 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 19 19:41:28 compute-0 nova_compute[186662]: 2026-02-19 19:41:28.558 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:28 compute-0 nova_compute[186662]: 2026-02-19 19:41:28.749 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:41:29 compute-0 nova_compute[186662]: 2026-02-19 19:41:29.266 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:41:29 compute-0 nova_compute[186662]: 2026-02-19 19:41:29.267 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:41:29 compute-0 nova_compute[186662]: 2026-02-19 19:41:29.267 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:41:29 compute-0 nova_compute[186662]: 2026-02-19 19:41:29.267 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:41:29 compute-0 nova_compute[186662]: 2026-02-19 19:41:29.378 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:41:29 compute-0 nova_compute[186662]: 2026-02-19 19:41:29.380 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:41:29 compute-0 nova_compute[186662]: 2026-02-19 19:41:29.401 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:41:29 compute-0 nova_compute[186662]: 2026-02-19 19:41:29.402 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5779MB free_disk=72.97608184814453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:41:29 compute-0 nova_compute[186662]: 2026-02-19 19:41:29.402 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:41:29 compute-0 nova_compute[186662]: 2026-02-19 19:41:29.402 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:41:29 compute-0 nova_compute[186662]: 2026-02-19 19:41:29.712 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:29 compute-0 podman[196025]: time="2026-02-19T19:41:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:41:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:41:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:41:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:41:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2192 "" "Go-http-client/1.1"
Feb 19 19:41:30 compute-0 nova_compute[186662]: 2026-02-19 19:41:30.517 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:41:30 compute-0 nova_compute[186662]: 2026-02-19 19:41:30.517 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:41:29 up  1:12,  0 user,  load average: 0.21, 0.28, 0.33\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:41:30 compute-0 nova_compute[186662]: 2026-02-19 19:41:30.612 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:41:31 compute-0 nova_compute[186662]: 2026-02-19 19:41:31.120 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:41:31 compute-0 openstack_network_exporter[198916]: ERROR   19:41:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:41:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:41:31 compute-0 openstack_network_exporter[198916]: ERROR   19:41:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:41:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:41:31 compute-0 nova_compute[186662]: 2026-02-19 19:41:31.630 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:41:31 compute-0 nova_compute[186662]: 2026-02-19 19:41:31.631 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.229s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:41:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:41:32.144 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:41:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:41:32.144 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:41:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:41:32.144 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:41:32 compute-0 podman[214521]: 2026-02-19 19:41:32.296768288 +0000 UTC m=+0.078831109 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:41:32 compute-0 nova_compute[186662]: 2026-02-19 19:41:32.457 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:41:33 compute-0 nova_compute[186662]: 2026-02-19 19:41:33.599 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:34 compute-0 nova_compute[186662]: 2026-02-19 19:41:34.759 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:38 compute-0 nova_compute[186662]: 2026-02-19 19:41:38.601 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:39 compute-0 nova_compute[186662]: 2026-02-19 19:41:39.761 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:40 compute-0 sshd-session[214546]: Invalid user manager from 106.51.64.128 port 14207
Feb 19 19:41:41 compute-0 sshd-session[214546]: Received disconnect from 106.51.64.128 port 14207:11: Bye Bye [preauth]
Feb 19 19:41:41 compute-0 sshd-session[214546]: Disconnected from invalid user manager 106.51.64.128 port 14207 [preauth]
Feb 19 19:41:43 compute-0 nova_compute[186662]: 2026-02-19 19:41:43.662 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:44 compute-0 nova_compute[186662]: 2026-02-19 19:41:44.772 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:48 compute-0 nova_compute[186662]: 2026-02-19 19:41:48.664 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:49 compute-0 nova_compute[186662]: 2026-02-19 19:41:49.774 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:50 compute-0 nova_compute[186662]: 2026-02-19 19:41:50.927 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Acquiring lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:41:50 compute-0 nova_compute[186662]: 2026-02-19 19:41:50.927 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:41:51 compute-0 nova_compute[186662]: 2026-02-19 19:41:51.434 186666 DEBUG nova.compute.manager [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Feb 19 19:41:52 compute-0 nova_compute[186662]: 2026-02-19 19:41:52.000 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:41:52 compute-0 nova_compute[186662]: 2026-02-19 19:41:52.000 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:41:52 compute-0 nova_compute[186662]: 2026-02-19 19:41:52.004 186666 DEBUG nova.virt.hardware [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Feb 19 19:41:52 compute-0 nova_compute[186662]: 2026-02-19 19:41:52.005 186666 INFO nova.compute.claims [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Claim successful on node compute-0.ctlplane.example.com
Feb 19 19:41:52 compute-0 podman[214549]: 2026-02-19 19:41:52.27684016 +0000 UTC m=+0.053350471 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 19 19:41:53 compute-0 nova_compute[186662]: 2026-02-19 19:41:53.101 186666 DEBUG nova.compute.provider_tree [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:41:53 compute-0 nova_compute[186662]: 2026-02-19 19:41:53.617 186666 DEBUG nova.scheduler.client.report [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:41:53 compute-0 nova_compute[186662]: 2026-02-19 19:41:53.667 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:54 compute-0 nova_compute[186662]: 2026-02-19 19:41:54.168 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.168s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:41:54 compute-0 nova_compute[186662]: 2026-02-19 19:41:54.169 186666 DEBUG nova.compute.manager [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Feb 19 19:41:54 compute-0 nova_compute[186662]: 2026-02-19 19:41:54.711 186666 DEBUG nova.compute.manager [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Feb 19 19:41:54 compute-0 nova_compute[186662]: 2026-02-19 19:41:54.711 186666 DEBUG nova.network.neutron [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Feb 19 19:41:54 compute-0 nova_compute[186662]: 2026-02-19 19:41:54.712 186666 WARNING neutronclient.v2_0.client [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:41:54 compute-0 nova_compute[186662]: 2026-02-19 19:41:54.712 186666 WARNING neutronclient.v2_0.client [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:41:54 compute-0 nova_compute[186662]: 2026-02-19 19:41:54.775 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:55 compute-0 nova_compute[186662]: 2026-02-19 19:41:55.225 186666 INFO nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 19 19:41:55 compute-0 nova_compute[186662]: 2026-02-19 19:41:55.395 186666 DEBUG nova.network.neutron [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Successfully created port: 41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Feb 19 19:41:55 compute-0 nova_compute[186662]: 2026-02-19 19:41:55.739 186666 DEBUG nova.compute.manager [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.086 186666 DEBUG nova.network.neutron [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Successfully updated port: 41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.158 186666 DEBUG nova.compute.manager [req-a1e04675-a942-432c-baad-0c74b9d6571b req-49dd2f82-741f-4ab6-86da-2e311b418df6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Received event network-changed-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.158 186666 DEBUG nova.compute.manager [req-a1e04675-a942-432c-baad-0c74b9d6571b req-49dd2f82-741f-4ab6-86da-2e311b418df6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Refreshing instance network info cache due to event network-changed-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.158 186666 DEBUG oslo_concurrency.lockutils [req-a1e04675-a942-432c-baad-0c74b9d6571b req-49dd2f82-741f-4ab6-86da-2e311b418df6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-d28dc50a-3cd1-4dc1-98f1-384bc695e924" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.159 186666 DEBUG oslo_concurrency.lockutils [req-a1e04675-a942-432c-baad-0c74b9d6571b req-49dd2f82-741f-4ab6-86da-2e311b418df6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-d28dc50a-3cd1-4dc1-98f1-384bc695e924" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.159 186666 DEBUG nova.network.neutron [req-a1e04675-a942-432c-baad-0c74b9d6571b req-49dd2f82-741f-4ab6-86da-2e311b418df6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Refreshing network info cache for port 41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Feb 19 19:41:56 compute-0 podman[214571]: 2026-02-19 19:41:56.268364697 +0000 UTC m=+0.043177859 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.expose-services=, architecture=x86_64, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, release=1770267347, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.593 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Acquiring lock "refresh_cache-d28dc50a-3cd1-4dc1-98f1-384bc695e924" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.664 186666 WARNING neutronclient.v2_0.client [req-a1e04675-a942-432c-baad-0c74b9d6571b req-49dd2f82-741f-4ab6-86da-2e311b418df6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.755 186666 DEBUG nova.network.neutron [req-a1e04675-a942-432c-baad-0c74b9d6571b req-49dd2f82-741f-4ab6-86da-2e311b418df6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.758 186666 DEBUG nova.compute.manager [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.759 186666 DEBUG nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.759 186666 INFO nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Creating image(s)
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.759 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Acquiring lock "/var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.759 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Lock "/var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.760 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Lock "/var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.760 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.763 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.764 186666 DEBUG oslo_concurrency.processutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.840 186666 DEBUG oslo_concurrency.processutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.842 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.843 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.844 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.851 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.852 186666 DEBUG oslo_concurrency.processutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.907 186666 DEBUG oslo_concurrency.processutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.908 186666 DEBUG oslo_concurrency.processutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.923 186666 DEBUG nova.network.neutron [req-a1e04675-a942-432c-baad-0c74b9d6571b req-49dd2f82-741f-4ab6-86da-2e311b418df6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.950 186666 DEBUG oslo_concurrency.processutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.951 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:41:56 compute-0 nova_compute[186662]: 2026-02-19 19:41:56.951 186666 DEBUG oslo_concurrency.processutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:41:57 compute-0 nova_compute[186662]: 2026-02-19 19:41:57.024 186666 DEBUG oslo_concurrency.processutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:41:57 compute-0 nova_compute[186662]: 2026-02-19 19:41:57.025 186666 DEBUG nova.virt.disk.api [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Checking if we can resize image /var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:41:57 compute-0 nova_compute[186662]: 2026-02-19 19:41:57.026 186666 DEBUG oslo_concurrency.processutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:41:57 compute-0 nova_compute[186662]: 2026-02-19 19:41:57.067 186666 DEBUG oslo_concurrency.processutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:41:57 compute-0 nova_compute[186662]: 2026-02-19 19:41:57.067 186666 DEBUG nova.virt.disk.api [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Cannot resize image /var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:41:57 compute-0 nova_compute[186662]: 2026-02-19 19:41:57.068 186666 DEBUG nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Feb 19 19:41:57 compute-0 nova_compute[186662]: 2026-02-19 19:41:57.068 186666 DEBUG nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Ensure instance console log exists: /var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Feb 19 19:41:57 compute-0 nova_compute[186662]: 2026-02-19 19:41:57.069 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:41:57 compute-0 nova_compute[186662]: 2026-02-19 19:41:57.069 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:41:57 compute-0 nova_compute[186662]: 2026-02-19 19:41:57.069 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:41:57 compute-0 nova_compute[186662]: 2026-02-19 19:41:57.433 186666 DEBUG oslo_concurrency.lockutils [req-a1e04675-a942-432c-baad-0c74b9d6571b req-49dd2f82-741f-4ab6-86da-2e311b418df6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-d28dc50a-3cd1-4dc1-98f1-384bc695e924" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:41:57 compute-0 nova_compute[186662]: 2026-02-19 19:41:57.434 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Acquired lock "refresh_cache-d28dc50a-3cd1-4dc1-98f1-384bc695e924" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:41:57 compute-0 nova_compute[186662]: 2026-02-19 19:41:57.434 186666 DEBUG nova.network.neutron [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:41:58 compute-0 nova_compute[186662]: 2026-02-19 19:41:58.670 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:58 compute-0 nova_compute[186662]: 2026-02-19 19:41:58.859 186666 DEBUG nova.network.neutron [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:41:59 compute-0 podman[214607]: 2026-02-19 19:41:59.277658287 +0000 UTC m=+0.053729151 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller)
Feb 19 19:41:59 compute-0 podman[196025]: time="2026-02-19T19:41:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:41:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:41:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:41:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:41:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2197 "" "Go-http-client/1.1"
Feb 19 19:41:59 compute-0 nova_compute[186662]: 2026-02-19 19:41:59.777 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:41:59 compute-0 nova_compute[186662]: 2026-02-19 19:41:59.830 186666 WARNING neutronclient.v2_0.client [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:41:59 compute-0 nova_compute[186662]: 2026-02-19 19:41:59.963 186666 DEBUG nova.network.neutron [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Updating instance_info_cache with network_info: [{"id": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "address": "fa:16:3e:44:b8:b3", "network": {"id": "68c034ae-92c4-4919-9175-8992eee924de", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-358018379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72833255ccf64eb28a4dd1a5d46b8625", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41dd1a70-5d", "ovs_interfaceid": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.482 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Releasing lock "refresh_cache-d28dc50a-3cd1-4dc1-98f1-384bc695e924" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.483 186666 DEBUG nova.compute.manager [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Instance network_info: |[{"id": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "address": "fa:16:3e:44:b8:b3", "network": {"id": "68c034ae-92c4-4919-9175-8992eee924de", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-358018379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72833255ccf64eb28a4dd1a5d46b8625", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41dd1a70-5d", "ovs_interfaceid": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.485 186666 DEBUG nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Start _get_guest_xml network_info=[{"id": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "address": "fa:16:3e:44:b8:b3", "network": {"id": "68c034ae-92c4-4919-9175-8992eee924de", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-358018379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72833255ccf64eb28a4dd1a5d46b8625", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41dd1a70-5d", "ovs_interfaceid": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'image_id': 'b8007ea6-afa7-4c5a-abc0-d9d7338ce087'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.489 186666 WARNING nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.490 186666 DEBUG nova.virt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-755197348', uuid='d28dc50a-3cd1-4dc1-98f1-384bc695e924'), owner=OwnerMeta(userid='96e4d16f3a064f518ae335f4a1dc7345', username='tempest-TestExecuteVmWorkloadBalanceStrategy-695507613-project-admin', projectid='5cd356f35cb9438f878370c6e951113c', projectname='tempest-TestExecuteVmWorkloadBalanceStrategy-695507613'), image=ImageMeta(id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "address": "fa:16:3e:44:b8:b3", "network": {"id": "68c034ae-92c4-4919-9175-8992eee924de", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-358018379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72833255ccf64eb28a4dd1a5d46b8625", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41dd1a70-5d", "ovs_interfaceid": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1771530120.4901829) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.494 186666 DEBUG nova.virt.libvirt.host [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.495 186666 DEBUG nova.virt.libvirt.host [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.498 186666 DEBUG nova.virt.libvirt.host [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.498 186666 DEBUG nova.virt.libvirt.host [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.499 186666 DEBUG nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.500 186666 DEBUG nova.virt.hardware [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-19T19:19:33Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.500 186666 DEBUG nova.virt.hardware [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.500 186666 DEBUG nova.virt.hardware [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.501 186666 DEBUG nova.virt.hardware [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.501 186666 DEBUG nova.virt.hardware [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.501 186666 DEBUG nova.virt.hardware [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.501 186666 DEBUG nova.virt.hardware [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.502 186666 DEBUG nova.virt.hardware [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.502 186666 DEBUG nova.virt.hardware [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.502 186666 DEBUG nova.virt.hardware [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.502 186666 DEBUG nova.virt.hardware [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.507 186666 DEBUG nova.virt.libvirt.vif [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:41:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-755197348',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-755197348',id=21,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5cd356f35cb9438f878370c6e951113c',ramdisk_id='',reservation_id='r-1bjy2ija',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-695507613',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-695507613-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:41:55Z,user_data=None,user_id='96e4d16f3a064f518ae335f4a1dc7345',uuid=d28dc50a-3cd1-4dc1-98f1-384bc695e924,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "address": "fa:16:3e:44:b8:b3", "network": {"id": "68c034ae-92c4-4919-9175-8992eee924de", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-358018379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72833255ccf64eb28a4dd1a5d46b8625", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41dd1a70-5d", "ovs_interfaceid": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.507 186666 DEBUG nova.network.os_vif_util [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Converting VIF {"id": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "address": "fa:16:3e:44:b8:b3", "network": {"id": "68c034ae-92c4-4919-9175-8992eee924de", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-358018379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72833255ccf64eb28a4dd1a5d46b8625", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41dd1a70-5d", "ovs_interfaceid": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.508 186666 DEBUG nova.network.os_vif_util [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:b8:b3,bridge_name='br-int',has_traffic_filtering=True,id=41dd1a70-5d3b-4180-b1cc-b01d51f10ae9,network=Network(68c034ae-92c4-4919-9175-8992eee924de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41dd1a70-5d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:42:00 compute-0 nova_compute[186662]: 2026-02-19 19:42:00.509 186666 DEBUG nova.objects.instance [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Lazy-loading 'pci_devices' on Instance uuid d28dc50a-3cd1-4dc1-98f1-384bc695e924 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.016 186666 DEBUG nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] End _get_guest_xml xml=<domain type="kvm">
Feb 19 19:42:01 compute-0 nova_compute[186662]:   <uuid>d28dc50a-3cd1-4dc1-98f1-384bc695e924</uuid>
Feb 19 19:42:01 compute-0 nova_compute[186662]:   <name>instance-00000015</name>
Feb 19 19:42:01 compute-0 nova_compute[186662]:   <memory>131072</memory>
Feb 19 19:42:01 compute-0 nova_compute[186662]:   <vcpu>1</vcpu>
Feb 19 19:42:01 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-755197348</nova:name>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:42:00</nova:creationTime>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:42:01 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:42:01 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:42:01 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:42:01 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:42:01 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:42:01 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:42:01 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:42:01 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:42:01 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:42:01 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:42:01 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:42:01 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:42:01 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:42:01 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:42:01 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:42:01 compute-0 nova_compute[186662]:         <nova:user uuid="96e4d16f3a064f518ae335f4a1dc7345">tempest-TestExecuteVmWorkloadBalanceStrategy-695507613-project-admin</nova:user>
Feb 19 19:42:01 compute-0 nova_compute[186662]:         <nova:project uuid="5cd356f35cb9438f878370c6e951113c">tempest-TestExecuteVmWorkloadBalanceStrategy-695507613</nova:project>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:42:01 compute-0 nova_compute[186662]:         <nova:port uuid="41dd1a70-5d3b-4180-b1cc-b01d51f10ae9">
Feb 19 19:42:01 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:42:01 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:42:01 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <system>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <entry name="serial">d28dc50a-3cd1-4dc1-98f1-384bc695e924</entry>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <entry name="uuid">d28dc50a-3cd1-4dc1-98f1-384bc695e924</entry>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     </system>
Feb 19 19:42:01 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:42:01 compute-0 nova_compute[186662]:   <os>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:   </os>
Feb 19 19:42:01 compute-0 nova_compute[186662]:   <features>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <vmcoreinfo/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:   </features>
Feb 19 19:42:01 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:42:01 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact">
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <model>Nehalem</model>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <topology sockets="1" cores="1" threads="1"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:42:01 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk.config"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <interface type="ethernet">
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <mac address="fa:16:3e:44:b8:b3"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <mtu size="1442"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <target dev="tap41dd1a70-5d"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <serial type="pty">
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/console.log" append="off"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <video>
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     </video>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <controller type="usb" index="0"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:42:01 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:42:01 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:42:01 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:42:01 compute-0 nova_compute[186662]: </domain>
Feb 19 19:42:01 compute-0 nova_compute[186662]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.016 186666 DEBUG nova.compute.manager [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Preparing to wait for external event network-vif-plugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.017 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Acquiring lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.017 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.017 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.018 186666 DEBUG nova.virt.libvirt.vif [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:41:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-755197348',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-755197348',id=21,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5cd356f35cb9438f878370c6e951113c',ramdisk_id='',reservation_id='r-1bjy2ija',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-695507613',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-695507613-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:41:55Z,user_data=None,user_id='96e4d16f3a064f518ae335f4a1dc7345',uuid=d28dc50a-3cd1-4dc1-98f1-384bc695e924,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "address": "fa:16:3e:44:b8:b3", "network": {"id": "68c034ae-92c4-4919-9175-8992eee924de", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-358018379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72833255ccf64eb28a4dd1a5d46b8625", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41dd1a70-5d", "ovs_interfaceid": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.018 186666 DEBUG nova.network.os_vif_util [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Converting VIF {"id": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "address": "fa:16:3e:44:b8:b3", "network": {"id": "68c034ae-92c4-4919-9175-8992eee924de", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-358018379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72833255ccf64eb28a4dd1a5d46b8625", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41dd1a70-5d", "ovs_interfaceid": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.018 186666 DEBUG nova.network.os_vif_util [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:b8:b3,bridge_name='br-int',has_traffic_filtering=True,id=41dd1a70-5d3b-4180-b1cc-b01d51f10ae9,network=Network(68c034ae-92c4-4919-9175-8992eee924de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41dd1a70-5d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.019 186666 DEBUG os_vif [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:b8:b3,bridge_name='br-int',has_traffic_filtering=True,id=41dd1a70-5d3b-4180-b1cc-b01d51f10ae9,network=Network(68c034ae-92c4-4919-9175-8992eee924de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41dd1a70-5d') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.019 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.019 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.019 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.020 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.020 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'b0300b01-1638-5357-803c-e5077741ecb9', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.021 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.022 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.024 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.024 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41dd1a70-5d, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.024 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap41dd1a70-5d, col_values=(('qos', UUID('a7a4ccc8-8c96-4bec-9198-090391c80143')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.024 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap41dd1a70-5d, col_values=(('external_ids', {'iface-id': '41dd1a70-5d3b-4180-b1cc-b01d51f10ae9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:44:b8:b3', 'vm-uuid': 'd28dc50a-3cd1-4dc1-98f1-384bc695e924'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.025 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:01 compute-0 NetworkManager[56519]: <info>  [1771530121.0265] manager: (tap41dd1a70-5d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.027 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.029 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:01 compute-0 nova_compute[186662]: 2026-02-19 19:42:01.030 186666 INFO os_vif [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:b8:b3,bridge_name='br-int',has_traffic_filtering=True,id=41dd1a70-5d3b-4180-b1cc-b01d51f10ae9,network=Network(68c034ae-92c4-4919-9175-8992eee924de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41dd1a70-5d')
Feb 19 19:42:01 compute-0 openstack_network_exporter[198916]: ERROR   19:42:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:42:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:42:01 compute-0 openstack_network_exporter[198916]: ERROR   19:42:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:42:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:42:02 compute-0 nova_compute[186662]: 2026-02-19 19:42:02.566 186666 DEBUG nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:42:02 compute-0 nova_compute[186662]: 2026-02-19 19:42:02.566 186666 DEBUG nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:42:02 compute-0 nova_compute[186662]: 2026-02-19 19:42:02.567 186666 DEBUG nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] No VIF found with MAC fa:16:3e:44:b8:b3, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Feb 19 19:42:02 compute-0 nova_compute[186662]: 2026-02-19 19:42:02.568 186666 INFO nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Using config drive
Feb 19 19:42:03 compute-0 nova_compute[186662]: 2026-02-19 19:42:03.077 186666 WARNING neutronclient.v2_0.client [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:42:03 compute-0 podman[214636]: 2026-02-19 19:42:03.335611654 +0000 UTC m=+0.101221321 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 19 19:42:03 compute-0 nova_compute[186662]: 2026-02-19 19:42:03.493 186666 INFO nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Creating config drive at /var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk.config
Feb 19 19:42:03 compute-0 nova_compute[186662]: 2026-02-19 19:42:03.496 186666 DEBUG oslo_concurrency.processutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpxl5pqz6_ execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:42:03 compute-0 nova_compute[186662]: 2026-02-19 19:42:03.616 186666 DEBUG oslo_concurrency.processutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpxl5pqz6_" returned: 0 in 0.120s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:42:03 compute-0 kernel: tap41dd1a70-5d: entered promiscuous mode
Feb 19 19:42:03 compute-0 NetworkManager[56519]: <info>  [1771530123.6612] manager: (tap41dd1a70-5d): new Tun device (/org/freedesktop/NetworkManager/Devices/64)
Feb 19 19:42:03 compute-0 nova_compute[186662]: 2026-02-19 19:42:03.661 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:03 compute-0 ovn_controller[96653]: 2026-02-19T19:42:03Z|00160|binding|INFO|Claiming lport 41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 for this chassis.
Feb 19 19:42:03 compute-0 ovn_controller[96653]: 2026-02-19T19:42:03Z|00161|binding|INFO|41dd1a70-5d3b-4180-b1cc-b01d51f10ae9: Claiming fa:16:3e:44:b8:b3 10.100.0.8
Feb 19 19:42:03 compute-0 nova_compute[186662]: 2026-02-19 19:42:03.666 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:03 compute-0 nova_compute[186662]: 2026-02-19 19:42:03.669 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.684 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:b8:b3 10.100.0.8'], port_security=['fa:16:3e:44:b8:b3 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'd28dc50a-3cd1-4dc1-98f1-384bc695e924', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-68c034ae-92c4-4919-9175-8992eee924de', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5cd356f35cb9438f878370c6e951113c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f2b8d61b-76c5-4a92-99b9-2ca57ab4c309', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db5091ff-cd1b-42d9-b277-948d5e6e25d3, chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=41dd1a70-5d3b-4180-b1cc-b01d51f10ae9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:42:03 compute-0 systemd-udevd[214678]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.685 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 in datapath 68c034ae-92c4-4919-9175-8992eee924de bound to our chassis
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.685 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 68c034ae-92c4-4919-9175-8992eee924de
Feb 19 19:42:03 compute-0 systemd-machined[156014]: New machine qemu-15-instance-00000015.
Feb 19 19:42:03 compute-0 ovn_controller[96653]: 2026-02-19T19:42:03Z|00162|binding|INFO|Setting lport 41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 ovn-installed in OVS
Feb 19 19:42:03 compute-0 ovn_controller[96653]: 2026-02-19T19:42:03Z|00163|binding|INFO|Setting lport 41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 up in Southbound
Feb 19 19:42:03 compute-0 NetworkManager[56519]: <info>  [1771530123.6953] device (tap41dd1a70-5d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:42:03 compute-0 NetworkManager[56519]: <info>  [1771530123.6958] device (tap41dd1a70-5d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.696 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[3198cb8f-54cc-47f5-8d79-2279e1be993b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.697 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap68c034ae-91 in ovnmeta-68c034ae-92c4-4919-9175-8992eee924de namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Feb 19 19:42:03 compute-0 nova_compute[186662]: 2026-02-19 19:42:03.697 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.700 207540 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap68c034ae-90 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.700 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[4b801f6a-d15b-4e67-b6fe-b6b0786e556b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.701 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[6d3cd0ad-4884-41e5-b9c4-82543c93f995]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:03 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-00000015.
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.710 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[f69e996a-b4da-4576-8089-b861e5edd0e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.725 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ba2b48d8-a041-4c65-9502-8ede5dc4d6b5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.744 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[5bc97a8b-4d79-4e5d-89c1-f5af3fbc5545]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.747 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[27984946-d7b9-4f7a-8ccb-9bf337497276]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:03 compute-0 NetworkManager[56519]: <info>  [1771530123.7481] manager: (tap68c034ae-90): new Veth device (/org/freedesktop/NetworkManager/Devices/65)
Feb 19 19:42:03 compute-0 systemd-udevd[214682]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.766 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[5d4e34a6-8386-40aa-8e5f-c5159e6eca43]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.768 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[ac31bd14-c99c-40d7-b791-560de739771e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:03 compute-0 NetworkManager[56519]: <info>  [1771530123.7834] device (tap68c034ae-90): carrier: link connected
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.784 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[407d9c1d-ce0c-4d0d-8f01-e26271922168]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.795 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[49e26c65-c33d-4f08-a002-991921f3de39]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap68c034ae-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:df:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439334, 'reachable_time': 42900, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214712, 'error': None, 'target': 'ovnmeta-68c034ae-92c4-4919-9175-8992eee924de', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.805 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[b7c628d6-7faa-410c-b897-242d81de29ca]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feae:df1e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439334, 'tstamp': 439334}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214713, 'error': None, 'target': 'ovnmeta-68c034ae-92c4-4919-9175-8992eee924de', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.814 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[365049c3-c681-4d08-9485-0dc07458529f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap68c034ae-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:df:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439334, 'reachable_time': 42900, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214714, 'error': None, 'target': 'ovnmeta-68c034ae-92c4-4919-9175-8992eee924de', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.829 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[f4b1baa6-1fd1-4c21-a87a-68d5ea349d96]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:03 compute-0 nova_compute[186662]: 2026-02-19 19:42:03.853 186666 DEBUG nova.compute.manager [req-8308c3c8-cfb7-4d43-a36b-0479ac732fb5 req-1c246b7e-fb02-4a00-be83-635b2fe1d7e4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Received event network-vif-plugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:42:03 compute-0 nova_compute[186662]: 2026-02-19 19:42:03.854 186666 DEBUG oslo_concurrency.lockutils [req-8308c3c8-cfb7-4d43-a36b-0479ac732fb5 req-1c246b7e-fb02-4a00-be83-635b2fe1d7e4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:42:03 compute-0 nova_compute[186662]: 2026-02-19 19:42:03.854 186666 DEBUG oslo_concurrency.lockutils [req-8308c3c8-cfb7-4d43-a36b-0479ac732fb5 req-1c246b7e-fb02-4a00-be83-635b2fe1d7e4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:42:03 compute-0 nova_compute[186662]: 2026-02-19 19:42:03.854 186666 DEBUG oslo_concurrency.lockutils [req-8308c3c8-cfb7-4d43-a36b-0479ac732fb5 req-1c246b7e-fb02-4a00-be83-635b2fe1d7e4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:42:03 compute-0 nova_compute[186662]: 2026-02-19 19:42:03.854 186666 DEBUG nova.compute.manager [req-8308c3c8-cfb7-4d43-a36b-0479ac732fb5 req-1c246b7e-fb02-4a00-be83-635b2fe1d7e4 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Processing event network-vif-plugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.859 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[faac6c77-645d-4725-b2cd-eafa9af816e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.860 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap68c034ae-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.860 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.860 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap68c034ae-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:42:03 compute-0 nova_compute[186662]: 2026-02-19 19:42:03.862 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:03 compute-0 kernel: tap68c034ae-90: entered promiscuous mode
Feb 19 19:42:03 compute-0 NetworkManager[56519]: <info>  [1771530123.8627] manager: (tap68c034ae-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Feb 19 19:42:03 compute-0 nova_compute[186662]: 2026-02-19 19:42:03.863 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.865 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap68c034ae-90, col_values=(('external_ids', {'iface-id': 'ff9c6a14-aac0-4622-8df4-fda6d4294002'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:42:03 compute-0 nova_compute[186662]: 2026-02-19 19:42:03.866 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:03 compute-0 ovn_controller[96653]: 2026-02-19T19:42:03Z|00164|binding|INFO|Releasing lport ff9c6a14-aac0-4622-8df4-fda6d4294002 from this chassis (sb_readonly=0)
Feb 19 19:42:03 compute-0 nova_compute[186662]: 2026-02-19 19:42:03.869 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.869 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[e8c59711-ecae-4e0d-b865-ebeb77c0b307]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.870 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/68c034ae-92c4-4919-9175-8992eee924de.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/68c034ae-92c4-4919-9175-8992eee924de.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.870 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/68c034ae-92c4-4919-9175-8992eee924de.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/68c034ae-92c4-4919-9175-8992eee924de.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.870 105986 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 68c034ae-92c4-4919-9175-8992eee924de disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.870 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/68c034ae-92c4-4919-9175-8992eee924de.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/68c034ae-92c4-4919-9175-8992eee924de.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.870 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[7f6d9f66-c29e-4ab7-8043-019482ff47fe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.870 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/68c034ae-92c4-4919-9175-8992eee924de.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/68c034ae-92c4-4919-9175-8992eee924de.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.870 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[0253bc5f-3d45-4532-9407-b99c3873449a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.871 105986 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: global
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     log         /dev/log local0 debug
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     log-tag     haproxy-metadata-proxy-68c034ae-92c4-4919-9175-8992eee924de
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     user        root
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     group       root
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     maxconn     1024
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     pidfile     /var/lib/neutron/external/pids/68c034ae-92c4-4919-9175-8992eee924de.pid.haproxy
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     daemon
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: defaults
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     log global
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     mode http
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     option httplog
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     option dontlognull
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     option http-server-close
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     option forwardfor
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     retries                 3
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     timeout http-request    30s
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     timeout connect         30s
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     timeout client          32s
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     timeout server          32s
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     timeout http-keep-alive 30s
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: listen listener
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     bind 169.254.169.254:80
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     server metadata /var/lib/neutron/metadata_proxy
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:     http-request add-header X-OVN-Network-ID 68c034ae-92c4-4919-9175-8992eee924de
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Feb 19 19:42:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:03.871 105986 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-68c034ae-92c4-4919-9175-8992eee924de', 'env', 'PROCESS_TAG=haproxy-68c034ae-92c4-4919-9175-8992eee924de', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/68c034ae-92c4-4919-9175-8992eee924de.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Feb 19 19:42:04 compute-0 nova_compute[186662]: 2026-02-19 19:42:04.013 186666 DEBUG nova.compute.manager [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Feb 19 19:42:04 compute-0 nova_compute[186662]: 2026-02-19 19:42:04.016 186666 DEBUG nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Feb 19 19:42:04 compute-0 nova_compute[186662]: 2026-02-19 19:42:04.019 186666 INFO nova.virt.libvirt.driver [-] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Instance spawned successfully.
Feb 19 19:42:04 compute-0 nova_compute[186662]: 2026-02-19 19:42:04.020 186666 DEBUG nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Feb 19 19:42:04 compute-0 podman[214753]: 2026-02-19 19:42:04.187812175 +0000 UTC m=+0.049580972 container create 85d2fd84b6443954db64ffbdc8780410956657bddf9bf55e6906ce0bb5c777a1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-68c034ae-92c4-4919-9175-8992eee924de, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 19 19:42:04 compute-0 systemd[1]: Started libpod-conmon-85d2fd84b6443954db64ffbdc8780410956657bddf9bf55e6906ce0bb5c777a1.scope.
Feb 19 19:42:04 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:42:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/affa1aa7c0c6fad315695633bf7ce7cf0a86aae5a0eae8afc79d1fd676a427bf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 19 19:42:04 compute-0 podman[214753]: 2026-02-19 19:42:04.25776447 +0000 UTC m=+0.119533277 container init 85d2fd84b6443954db64ffbdc8780410956657bddf9bf55e6906ce0bb5c777a1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-68c034ae-92c4-4919-9175-8992eee924de, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 19 19:42:04 compute-0 podman[214753]: 2026-02-19 19:42:04.161827636 +0000 UTC m=+0.023596453 image pull dfc4ee2c621ea595c16a7cd7a8233293e80df6b99346b1ce1a7ed22a35aace27 38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Feb 19 19:42:04 compute-0 podman[214753]: 2026-02-19 19:42:04.261418867 +0000 UTC m=+0.123187674 container start 85d2fd84b6443954db64ffbdc8780410956657bddf9bf55e6906ce0bb5c777a1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-68c034ae-92c4-4919-9175-8992eee924de, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Feb 19 19:42:04 compute-0 neutron-haproxy-ovnmeta-68c034ae-92c4-4919-9175-8992eee924de[214768]: [NOTICE]   (214772) : New worker (214774) forked
Feb 19 19:42:04 compute-0 neutron-haproxy-ovnmeta-68c034ae-92c4-4919-9175-8992eee924de[214768]: [NOTICE]   (214772) : Loading success.
Feb 19 19:42:04 compute-0 nova_compute[186662]: 2026-02-19 19:42:04.529 186666 DEBUG nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:42:04 compute-0 nova_compute[186662]: 2026-02-19 19:42:04.529 186666 DEBUG nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:42:04 compute-0 nova_compute[186662]: 2026-02-19 19:42:04.530 186666 DEBUG nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:42:04 compute-0 nova_compute[186662]: 2026-02-19 19:42:04.530 186666 DEBUG nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:42:04 compute-0 nova_compute[186662]: 2026-02-19 19:42:04.530 186666 DEBUG nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:42:04 compute-0 nova_compute[186662]: 2026-02-19 19:42:04.531 186666 DEBUG nova.virt.libvirt.driver [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:42:04 compute-0 nova_compute[186662]: 2026-02-19 19:42:04.779 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:05 compute-0 nova_compute[186662]: 2026-02-19 19:42:05.041 186666 INFO nova.compute.manager [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Took 8.28 seconds to spawn the instance on the hypervisor.
Feb 19 19:42:05 compute-0 nova_compute[186662]: 2026-02-19 19:42:05.041 186666 DEBUG nova.compute.manager [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:42:05 compute-0 nova_compute[186662]: 2026-02-19 19:42:05.574 186666 INFO nova.compute.manager [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Took 13.63 seconds to build instance.
Feb 19 19:42:05 compute-0 nova_compute[186662]: 2026-02-19 19:42:05.920 186666 DEBUG nova.compute.manager [req-76ed72e9-8694-4f56-aa0c-9d248ae38e98 req-cb71707f-5605-4c07-aec9-3e7dcbef4804 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Received event network-vif-plugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:42:05 compute-0 nova_compute[186662]: 2026-02-19 19:42:05.920 186666 DEBUG oslo_concurrency.lockutils [req-76ed72e9-8694-4f56-aa0c-9d248ae38e98 req-cb71707f-5605-4c07-aec9-3e7dcbef4804 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:42:05 compute-0 nova_compute[186662]: 2026-02-19 19:42:05.920 186666 DEBUG oslo_concurrency.lockutils [req-76ed72e9-8694-4f56-aa0c-9d248ae38e98 req-cb71707f-5605-4c07-aec9-3e7dcbef4804 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:42:05 compute-0 nova_compute[186662]: 2026-02-19 19:42:05.921 186666 DEBUG oslo_concurrency.lockutils [req-76ed72e9-8694-4f56-aa0c-9d248ae38e98 req-cb71707f-5605-4c07-aec9-3e7dcbef4804 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:42:05 compute-0 nova_compute[186662]: 2026-02-19 19:42:05.921 186666 DEBUG nova.compute.manager [req-76ed72e9-8694-4f56-aa0c-9d248ae38e98 req-cb71707f-5605-4c07-aec9-3e7dcbef4804 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] No waiting events found dispatching network-vif-plugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:42:05 compute-0 nova_compute[186662]: 2026-02-19 19:42:05.921 186666 WARNING nova.compute.manager [req-76ed72e9-8694-4f56-aa0c-9d248ae38e98 req-cb71707f-5605-4c07-aec9-3e7dcbef4804 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Received unexpected event network-vif-plugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 for instance with vm_state active and task_state None.
Feb 19 19:42:06 compute-0 nova_compute[186662]: 2026-02-19 19:42:06.026 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:06 compute-0 nova_compute[186662]: 2026-02-19 19:42:06.079 186666 DEBUG oslo_concurrency.lockutils [None req-8d8ac08a-e75b-4dae-a6e4-6c17d88cc40d 96e4d16f3a064f518ae335f4a1dc7345 5cd356f35cb9438f878370c6e951113c - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.152s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:42:09 compute-0 nova_compute[186662]: 2026-02-19 19:42:09.783 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:11 compute-0 nova_compute[186662]: 2026-02-19 19:42:11.029 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:14 compute-0 nova_compute[186662]: 2026-02-19 19:42:14.784 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:16 compute-0 nova_compute[186662]: 2026-02-19 19:42:16.032 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:16 compute-0 ovn_controller[96653]: 2026-02-19T19:42:16Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:44:b8:b3 10.100.0.8
Feb 19 19:42:16 compute-0 ovn_controller[96653]: 2026-02-19T19:42:16Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:44:b8:b3 10.100.0.8
Feb 19 19:42:19 compute-0 nova_compute[186662]: 2026-02-19 19:42:19.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:42:19 compute-0 nova_compute[186662]: 2026-02-19 19:42:19.785 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:21 compute-0 nova_compute[186662]: 2026-02-19 19:42:21.034 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:21 compute-0 nova_compute[186662]: 2026-02-19 19:42:21.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:42:23 compute-0 podman[214796]: 2026-02-19 19:42:23.31370555 +0000 UTC m=+0.081298146 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:42:23 compute-0 nova_compute[186662]: 2026-02-19 19:42:23.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:42:24 compute-0 nova_compute[186662]: 2026-02-19 19:42:24.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:42:24 compute-0 nova_compute[186662]: 2026-02-19 19:42:24.787 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:25 compute-0 nova_compute[186662]: 2026-02-19 19:42:25.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:42:25 compute-0 nova_compute[186662]: 2026-02-19 19:42:25.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:42:26 compute-0 nova_compute[186662]: 2026-02-19 19:42:26.036 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:27 compute-0 podman[214819]: 2026-02-19 19:42:27.260555862 +0000 UTC m=+0.041866688 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container)
Feb 19 19:42:27 compute-0 nova_compute[186662]: 2026-02-19 19:42:27.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:42:28 compute-0 nova_compute[186662]: 2026-02-19 19:42:28.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:42:28 compute-0 nova_compute[186662]: 2026-02-19 19:42:28.664 186666 DEBUG nova.compute.manager [None req-adb47003-fab5-410c-b44b-4626e0b28e9c 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:636
Feb 19 19:42:28 compute-0 nova_compute[186662]: 2026-02-19 19:42:28.715 186666 DEBUG nova.compute.provider_tree [None req-adb47003-fab5-410c-b44b-4626e0b28e9c 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Updating resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 generation from 26 to 31 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Feb 19 19:42:29 compute-0 sshd-session[214843]: Received disconnect from 91.224.92.108 port 41160:11:  [preauth]
Feb 19 19:42:29 compute-0 sshd-session[214843]: Disconnected from authenticating user root 91.224.92.108 port 41160 [preauth]
Feb 19 19:42:29 compute-0 nova_compute[186662]: 2026-02-19 19:42:29.089 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:42:29 compute-0 nova_compute[186662]: 2026-02-19 19:42:29.090 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:42:29 compute-0 nova_compute[186662]: 2026-02-19 19:42:29.090 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:42:29 compute-0 nova_compute[186662]: 2026-02-19 19:42:29.090 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:42:29 compute-0 podman[196025]: time="2026-02-19T19:42:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:42:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:42:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17221 "" "Go-http-client/1.1"
Feb 19 19:42:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:42:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2660 "" "Go-http-client/1.1"
Feb 19 19:42:29 compute-0 nova_compute[186662]: 2026-02-19 19:42:29.789 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:30 compute-0 nova_compute[186662]: 2026-02-19 19:42:30.129 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:42:30 compute-0 nova_compute[186662]: 2026-02-19 19:42:30.187 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:42:30 compute-0 nova_compute[186662]: 2026-02-19 19:42:30.188 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:42:30 compute-0 nova_compute[186662]: 2026-02-19 19:42:30.240 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:42:30 compute-0 podman[214850]: 2026-02-19 19:42:30.29206129 +0000 UTC m=+0.067622422 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.43.0)
Feb 19 19:42:30 compute-0 nova_compute[186662]: 2026-02-19 19:42:30.357 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:42:30 compute-0 nova_compute[186662]: 2026-02-19 19:42:30.358 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:42:30 compute-0 nova_compute[186662]: 2026-02-19 19:42:30.373 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:42:30 compute-0 nova_compute[186662]: 2026-02-19 19:42:30.373 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5641MB free_disk=72.94737243652344GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:42:30 compute-0 nova_compute[186662]: 2026-02-19 19:42:30.374 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:42:30 compute-0 nova_compute[186662]: 2026-02-19 19:42:30.374 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:42:31 compute-0 nova_compute[186662]: 2026-02-19 19:42:31.039 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:31 compute-0 openstack_network_exporter[198916]: ERROR   19:42:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:42:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:42:31 compute-0 openstack_network_exporter[198916]: ERROR   19:42:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:42:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:42:31 compute-0 nova_compute[186662]: 2026-02-19 19:42:31.945 186666 INFO nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance a6ff9943-ba78-421e-8dfb-404f3e720229 has allocations against this compute host but is not found in the database.
Feb 19 19:42:31 compute-0 nova_compute[186662]: 2026-02-19 19:42:31.946 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:42:31 compute-0 nova_compute[186662]: 2026-02-19 19:42:31.946 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:42:30 up  1:13,  0 user,  load average: 0.32, 0.29, 0.33\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_5cd356f35cb9438f878370c6e951113c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:42:32 compute-0 nova_compute[186662]: 2026-02-19 19:42:32.000 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:42:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:32.145 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:42:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:32.145 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:42:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:32.146 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:42:32 compute-0 nova_compute[186662]: 2026-02-19 19:42:32.508 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:42:33 compute-0 nova_compute[186662]: 2026-02-19 19:42:33.015 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:42:33 compute-0 nova_compute[186662]: 2026-02-19 19:42:33.016 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.642s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:42:33 compute-0 ovn_controller[96653]: 2026-02-19T19:42:33Z|00165|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Feb 19 19:42:34 compute-0 nova_compute[186662]: 2026-02-19 19:42:34.016 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:42:34 compute-0 podman[214880]: 2026-02-19 19:42:34.25946451 +0000 UTC m=+0.040958136 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:42:34 compute-0 nova_compute[186662]: 2026-02-19 19:42:34.791 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:35 compute-0 sshd-session[214904]: Invalid user systemd from 189.165.79.177 port 40146
Feb 19 19:42:35 compute-0 sshd-session[214904]: Received disconnect from 189.165.79.177 port 40146:11: Bye Bye [preauth]
Feb 19 19:42:35 compute-0 sshd-session[214904]: Disconnected from invalid user systemd 189.165.79.177 port 40146 [preauth]
Feb 19 19:42:36 compute-0 nova_compute[186662]: 2026-02-19 19:42:36.043 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:36 compute-0 nova_compute[186662]: 2026-02-19 19:42:36.730 186666 DEBUG nova.virt.libvirt.driver [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Check if temp file /var/lib/nova/instances/tmphbjo_upr exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Feb 19 19:42:36 compute-0 nova_compute[186662]: 2026-02-19 19:42:36.736 186666 DEBUG nova.compute.manager [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphbjo_upr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d28dc50a-3cd1-4dc1-98f1-384bc695e924',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Feb 19 19:42:39 compute-0 nova_compute[186662]: 2026-02-19 19:42:39.792 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:41 compute-0 nova_compute[186662]: 2026-02-19 19:42:41.046 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:41 compute-0 nova_compute[186662]: 2026-02-19 19:42:41.161 186666 DEBUG oslo_concurrency.processutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:42:41 compute-0 nova_compute[186662]: 2026-02-19 19:42:41.220 186666 DEBUG oslo_concurrency.processutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:42:41 compute-0 nova_compute[186662]: 2026-02-19 19:42:41.221 186666 DEBUG oslo_concurrency.processutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:42:41 compute-0 nova_compute[186662]: 2026-02-19 19:42:41.263 186666 DEBUG oslo_concurrency.processutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:42:41 compute-0 nova_compute[186662]: 2026-02-19 19:42:41.264 186666 DEBUG nova.compute.manager [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Preparing to wait for external event network-vif-plugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Feb 19 19:42:41 compute-0 nova_compute[186662]: 2026-02-19 19:42:41.265 186666 DEBUG oslo_concurrency.lockutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:42:41 compute-0 nova_compute[186662]: 2026-02-19 19:42:41.265 186666 DEBUG oslo_concurrency.lockutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:42:41 compute-0 nova_compute[186662]: 2026-02-19 19:42:41.265 186666 DEBUG oslo_concurrency.lockutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:42:44 compute-0 nova_compute[186662]: 2026-02-19 19:42:44.794 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:46 compute-0 nova_compute[186662]: 2026-02-19 19:42:46.048 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:48 compute-0 nova_compute[186662]: 2026-02-19 19:42:48.138 186666 DEBUG nova.compute.manager [req-4e6c0f21-32c1-4a47-b6db-e0c3143d34d5 req-6d9fd0a0-3bbe-42a4-b87e-eead542126ab 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Received event network-vif-unplugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:42:48 compute-0 nova_compute[186662]: 2026-02-19 19:42:48.138 186666 DEBUG oslo_concurrency.lockutils [req-4e6c0f21-32c1-4a47-b6db-e0c3143d34d5 req-6d9fd0a0-3bbe-42a4-b87e-eead542126ab 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:42:48 compute-0 nova_compute[186662]: 2026-02-19 19:42:48.139 186666 DEBUG oslo_concurrency.lockutils [req-4e6c0f21-32c1-4a47-b6db-e0c3143d34d5 req-6d9fd0a0-3bbe-42a4-b87e-eead542126ab 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:42:48 compute-0 nova_compute[186662]: 2026-02-19 19:42:48.139 186666 DEBUG oslo_concurrency.lockutils [req-4e6c0f21-32c1-4a47-b6db-e0c3143d34d5 req-6d9fd0a0-3bbe-42a4-b87e-eead542126ab 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:42:48 compute-0 nova_compute[186662]: 2026-02-19 19:42:48.140 186666 DEBUG nova.compute.manager [req-4e6c0f21-32c1-4a47-b6db-e0c3143d34d5 req-6d9fd0a0-3bbe-42a4-b87e-eead542126ab 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] No event matching network-vif-unplugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 in dict_keys([('network-vif-plugged', '41dd1a70-5d3b-4180-b1cc-b01d51f10ae9')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Feb 19 19:42:48 compute-0 nova_compute[186662]: 2026-02-19 19:42:48.140 186666 DEBUG nova.compute.manager [req-4e6c0f21-32c1-4a47-b6db-e0c3143d34d5 req-6d9fd0a0-3bbe-42a4-b87e-eead542126ab 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Received event network-vif-unplugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:42:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:48.961 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:42:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:48.962 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:42:48 compute-0 nova_compute[186662]: 2026-02-19 19:42:48.962 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:49 compute-0 nova_compute[186662]: 2026-02-19 19:42:49.795 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:50 compute-0 nova_compute[186662]: 2026-02-19 19:42:50.204 186666 DEBUG nova.compute.manager [req-24ce9561-54d4-4fb0-a691-a38b1dd96436 req-d9371c9f-d1c9-49c3-9498-48f7238385c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Received event network-vif-plugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:42:50 compute-0 nova_compute[186662]: 2026-02-19 19:42:50.204 186666 DEBUG oslo_concurrency.lockutils [req-24ce9561-54d4-4fb0-a691-a38b1dd96436 req-d9371c9f-d1c9-49c3-9498-48f7238385c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:42:50 compute-0 nova_compute[186662]: 2026-02-19 19:42:50.205 186666 DEBUG oslo_concurrency.lockutils [req-24ce9561-54d4-4fb0-a691-a38b1dd96436 req-d9371c9f-d1c9-49c3-9498-48f7238385c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:42:50 compute-0 nova_compute[186662]: 2026-02-19 19:42:50.205 186666 DEBUG oslo_concurrency.lockutils [req-24ce9561-54d4-4fb0-a691-a38b1dd96436 req-d9371c9f-d1c9-49c3-9498-48f7238385c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:42:50 compute-0 nova_compute[186662]: 2026-02-19 19:42:50.205 186666 DEBUG nova.compute.manager [req-24ce9561-54d4-4fb0-a691-a38b1dd96436 req-d9371c9f-d1c9-49c3-9498-48f7238385c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Processing event network-vif-plugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Feb 19 19:42:50 compute-0 nova_compute[186662]: 2026-02-19 19:42:50.205 186666 DEBUG nova.compute.manager [req-24ce9561-54d4-4fb0-a691-a38b1dd96436 req-d9371c9f-d1c9-49c3-9498-48f7238385c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Received event network-changed-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:42:50 compute-0 nova_compute[186662]: 2026-02-19 19:42:50.206 186666 DEBUG nova.compute.manager [req-24ce9561-54d4-4fb0-a691-a38b1dd96436 req-d9371c9f-d1c9-49c3-9498-48f7238385c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Refreshing instance network info cache due to event network-changed-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Feb 19 19:42:50 compute-0 nova_compute[186662]: 2026-02-19 19:42:50.206 186666 DEBUG oslo_concurrency.lockutils [req-24ce9561-54d4-4fb0-a691-a38b1dd96436 req-d9371c9f-d1c9-49c3-9498-48f7238385c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-d28dc50a-3cd1-4dc1-98f1-384bc695e924" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:42:50 compute-0 nova_compute[186662]: 2026-02-19 19:42:50.206 186666 DEBUG oslo_concurrency.lockutils [req-24ce9561-54d4-4fb0-a691-a38b1dd96436 req-d9371c9f-d1c9-49c3-9498-48f7238385c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-d28dc50a-3cd1-4dc1-98f1-384bc695e924" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:42:50 compute-0 nova_compute[186662]: 2026-02-19 19:42:50.206 186666 DEBUG nova.network.neutron [req-24ce9561-54d4-4fb0-a691-a38b1dd96436 req-d9371c9f-d1c9-49c3-9498-48f7238385c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Refreshing network info cache for port 41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Feb 19 19:42:50 compute-0 nova_compute[186662]: 2026-02-19 19:42:50.712 186666 WARNING neutronclient.v2_0.client [req-24ce9561-54d4-4fb0-a691-a38b1dd96436 req-d9371c9f-d1c9-49c3-9498-48f7238385c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:42:50 compute-0 nova_compute[186662]: 2026-02-19 19:42:50.800 186666 INFO nova.compute.manager [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Took 9.53 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 19 19:42:50 compute-0 nova_compute[186662]: 2026-02-19 19:42:50.800 186666 DEBUG nova.compute.manager [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Feb 19 19:42:51 compute-0 nova_compute[186662]: 2026-02-19 19:42:51.049 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:51 compute-0 nova_compute[186662]: 2026-02-19 19:42:51.308 186666 DEBUG nova.compute.manager [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphbjo_upr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d28dc50a-3cd1-4dc1-98f1-384bc695e924',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(a6ff9943-ba78-421e-8dfb-404f3e720229),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Feb 19 19:42:51 compute-0 nova_compute[186662]: 2026-02-19 19:42:51.486 186666 WARNING neutronclient.v2_0.client [req-24ce9561-54d4-4fb0-a691-a38b1dd96436 req-d9371c9f-d1c9-49c3-9498-48f7238385c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:42:51 compute-0 nova_compute[186662]: 2026-02-19 19:42:51.626 186666 DEBUG nova.network.neutron [req-24ce9561-54d4-4fb0-a691-a38b1dd96436 req-d9371c9f-d1c9-49c3-9498-48f7238385c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Updated VIF entry in instance network info cache for port 41dd1a70-5d3b-4180-b1cc-b01d51f10ae9. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Feb 19 19:42:51 compute-0 nova_compute[186662]: 2026-02-19 19:42:51.627 186666 DEBUG nova.network.neutron [req-24ce9561-54d4-4fb0-a691-a38b1dd96436 req-d9371c9f-d1c9-49c3-9498-48f7238385c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Updating instance_info_cache with network_info: [{"id": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "address": "fa:16:3e:44:b8:b3", "network": {"id": "68c034ae-92c4-4919-9175-8992eee924de", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-358018379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72833255ccf64eb28a4dd1a5d46b8625", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41dd1a70-5d", "ovs_interfaceid": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:42:52 compute-0 nova_compute[186662]: 2026-02-19 19:42:52.744 186666 DEBUG nova.objects.instance [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'migration_context' on Instance uuid d28dc50a-3cd1-4dc1-98f1-384bc695e924 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:42:52 compute-0 nova_compute[186662]: 2026-02-19 19:42:52.745 186666 DEBUG nova.virt.libvirt.driver [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Feb 19 19:42:52 compute-0 nova_compute[186662]: 2026-02-19 19:42:52.746 186666 DEBUG nova.virt.libvirt.driver [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Feb 19 19:42:52 compute-0 nova_compute[186662]: 2026-02-19 19:42:52.747 186666 DEBUG nova.virt.libvirt.driver [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Feb 19 19:42:52 compute-0 nova_compute[186662]: 2026-02-19 19:42:52.984 186666 DEBUG oslo_concurrency.lockutils [req-24ce9561-54d4-4fb0-a691-a38b1dd96436 req-d9371c9f-d1c9-49c3-9498-48f7238385c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-d28dc50a-3cd1-4dc1-98f1-384bc695e924" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:42:53 compute-0 nova_compute[186662]: 2026-02-19 19:42:53.249 186666 DEBUG nova.virt.libvirt.driver [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Feb 19 19:42:53 compute-0 nova_compute[186662]: 2026-02-19 19:42:53.250 186666 DEBUG nova.virt.libvirt.driver [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Feb 19 19:42:53 compute-0 nova_compute[186662]: 2026-02-19 19:42:53.354 186666 DEBUG nova.virt.libvirt.vif [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:41:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-755197348',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-755197348',id=21,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:42:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5cd356f35cb9438f878370c6e951113c',ramdisk_id='',reservation_id='r-1bjy2ija',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-695507613',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-695507613-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:42:05Z,user_data=None,user_id='96e4d16f3a064f518ae335f4a1dc7345',uuid=d28dc50a-3cd1-4dc1-98f1-384bc695e924,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "address": "fa:16:3e:44:b8:b3", "network": {"id": "68c034ae-92c4-4919-9175-8992eee924de", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-358018379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72833255ccf64eb28a4dd1a5d46b8625", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap41dd1a70-5d", "ovs_interfaceid": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Feb 19 19:42:53 compute-0 nova_compute[186662]: 2026-02-19 19:42:53.355 186666 DEBUG nova.network.os_vif_util [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "address": "fa:16:3e:44:b8:b3", "network": {"id": "68c034ae-92c4-4919-9175-8992eee924de", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-358018379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72833255ccf64eb28a4dd1a5d46b8625", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap41dd1a70-5d", "ovs_interfaceid": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:42:53 compute-0 nova_compute[186662]: 2026-02-19 19:42:53.355 186666 DEBUG nova.network.os_vif_util [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:b8:b3,bridge_name='br-int',has_traffic_filtering=True,id=41dd1a70-5d3b-4180-b1cc-b01d51f10ae9,network=Network(68c034ae-92c4-4919-9175-8992eee924de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41dd1a70-5d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:42:53 compute-0 nova_compute[186662]: 2026-02-19 19:42:53.356 186666 DEBUG nova.virt.libvirt.migration [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Updating guest XML with vif config: <interface type="ethernet">
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <mac address="fa:16:3e:44:b8:b3"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <model type="virtio"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <mtu size="1442"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <target dev="tap41dd1a70-5d"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]: </interface>
Feb 19 19:42:53 compute-0 nova_compute[186662]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Feb 19 19:42:53 compute-0 nova_compute[186662]: 2026-02-19 19:42:53.356 186666 DEBUG nova.virt.libvirt.migration [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <name>instance-00000015</name>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <uuid>d28dc50a-3cd1-4dc1-98f1-384bc695e924</uuid>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-755197348</nova:name>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:42:00</nova:creationTime>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:42:53 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:42:53 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:user uuid="96e4d16f3a064f518ae335f4a1dc7345">tempest-TestExecuteVmWorkloadBalanceStrategy-695507613-project-admin</nova:user>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:project uuid="5cd356f35cb9438f878370c6e951113c">tempest-TestExecuteVmWorkloadBalanceStrategy-695507613</nova:project>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:port uuid="41dd1a70-5d3b-4180-b1cc-b01d51f10ae9">
Feb 19 19:42:53 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <memory unit="KiB">131072</memory>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <currentMemory unit="KiB">131072</currentMemory>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <vcpu placement="static">1</vcpu>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <resource>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <partition>/machine</partition>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </resource>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <system>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <entry name="serial">d28dc50a-3cd1-4dc1-98f1-384bc695e924</entry>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <entry name="uuid">d28dc50a-3cd1-4dc1-98f1-384bc695e924</entry>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </system>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <os>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </os>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <features>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <vmcoreinfo state="on"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </features>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact" check="partial">
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <model fallback="allow">Nehalem</model>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <on_poweroff>destroy</on_poweroff>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <on_reboot>restart</on_reboot>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <on_crash>destroy</on_crash>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk.config"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <readonly/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="0" model="pcie-root"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="1" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="1" port="0x10"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="2" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="2" port="0x11"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="3" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="3" port="0x12"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="4" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="4" port="0x13"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="5" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="5" port="0x14"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="6" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="6" port="0x15"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="7" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="7" port="0x16"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="8" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="8" port="0x17"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="9" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="9" port="0x18"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="10" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="10" port="0x19"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="11" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="11" port="0x1a"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="12" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="12" port="0x1b"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="13" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="13" port="0x1c"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="14" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="14" port="0x1d"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="15" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="15" port="0x1e"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="16" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="16" port="0x1f"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="17" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="17" port="0x20"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="18" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="18" port="0x21"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="19" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="19" port="0x22"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="20" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="20" port="0x23"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="21" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="21" port="0x24"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="22" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="22" port="0x25"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="23" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="23" port="0x26"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="24" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="24" port="0x27"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="25" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="25" port="0x28"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-pci-bridge"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="usb" index="0" model="piix3-uhci">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="sata" index="0">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <interface type="ethernet"><mac address="fa:16:3e:44:b8:b3"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap41dd1a70-5d"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </interface><serial type="pty">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/console.log" append="off"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target type="isa-serial" port="0">
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <model name="isa-serial"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       </target>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <console type="pty">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/console.log" append="off"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target type="serial" port="0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </console>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="usb" bus="0" port="1"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </input>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <input type="mouse" bus="ps2"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <listen type="address" address="::"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </graphics>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <video>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model type="virtio" heads="1" primary="yes"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </video>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]: </domain>
Feb 19 19:42:53 compute-0 nova_compute[186662]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Feb 19 19:42:53 compute-0 nova_compute[186662]: 2026-02-19 19:42:53.358 186666 DEBUG nova.virt.libvirt.migration [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <name>instance-00000015</name>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <uuid>d28dc50a-3cd1-4dc1-98f1-384bc695e924</uuid>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-755197348</nova:name>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:42:00</nova:creationTime>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:42:53 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:42:53 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:user uuid="96e4d16f3a064f518ae335f4a1dc7345">tempest-TestExecuteVmWorkloadBalanceStrategy-695507613-project-admin</nova:user>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:project uuid="5cd356f35cb9438f878370c6e951113c">tempest-TestExecuteVmWorkloadBalanceStrategy-695507613</nova:project>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:port uuid="41dd1a70-5d3b-4180-b1cc-b01d51f10ae9">
Feb 19 19:42:53 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <memory unit="KiB">131072</memory>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <currentMemory unit="KiB">131072</currentMemory>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <vcpu placement="static">1</vcpu>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <resource>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <partition>/machine</partition>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </resource>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <system>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <entry name="serial">d28dc50a-3cd1-4dc1-98f1-384bc695e924</entry>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <entry name="uuid">d28dc50a-3cd1-4dc1-98f1-384bc695e924</entry>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </system>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <os>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </os>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <features>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <vmcoreinfo state="on"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </features>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact" check="partial">
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <model fallback="allow">Nehalem</model>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <on_poweroff>destroy</on_poweroff>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <on_reboot>restart</on_reboot>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <on_crash>destroy</on_crash>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk.config"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <readonly/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="0" model="pcie-root"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="1" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="1" port="0x10"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="2" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="2" port="0x11"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="3" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="3" port="0x12"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="4" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="4" port="0x13"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="5" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="5" port="0x14"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="6" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="6" port="0x15"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="7" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="7" port="0x16"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="8" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="8" port="0x17"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="9" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="9" port="0x18"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="10" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="10" port="0x19"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="11" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="11" port="0x1a"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="12" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="12" port="0x1b"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="13" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="13" port="0x1c"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="14" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="14" port="0x1d"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="15" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="15" port="0x1e"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="16" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="16" port="0x1f"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="17" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="17" port="0x20"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="18" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="18" port="0x21"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="19" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="19" port="0x22"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="20" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="20" port="0x23"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="21" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="21" port="0x24"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="22" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="22" port="0x25"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="23" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="23" port="0x26"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="24" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="24" port="0x27"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="25" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="25" port="0x28"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-pci-bridge"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="usb" index="0" model="piix3-uhci">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="sata" index="0">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <interface type="ethernet"><mac address="fa:16:3e:44:b8:b3"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap41dd1a70-5d"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </interface><serial type="pty">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/console.log" append="off"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target type="isa-serial" port="0">
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <model name="isa-serial"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       </target>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <console type="pty">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/console.log" append="off"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target type="serial" port="0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </console>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="usb" bus="0" port="1"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </input>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <input type="mouse" bus="ps2"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <listen type="address" address="::"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </graphics>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <video>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model type="virtio" heads="1" primary="yes"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </video>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]: </domain>
Feb 19 19:42:53 compute-0 nova_compute[186662]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Feb 19 19:42:53 compute-0 nova_compute[186662]: 2026-02-19 19:42:53.359 186666 DEBUG nova.virt.libvirt.migration [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] _update_pci_xml output xml=<domain type="kvm">
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <name>instance-00000015</name>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <uuid>d28dc50a-3cd1-4dc1-98f1-384bc695e924</uuid>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-755197348</nova:name>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:42:00</nova:creationTime>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:42:53 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:42:53 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:user uuid="96e4d16f3a064f518ae335f4a1dc7345">tempest-TestExecuteVmWorkloadBalanceStrategy-695507613-project-admin</nova:user>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:project uuid="5cd356f35cb9438f878370c6e951113c">tempest-TestExecuteVmWorkloadBalanceStrategy-695507613</nova:project>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <nova:port uuid="41dd1a70-5d3b-4180-b1cc-b01d51f10ae9">
Feb 19 19:42:53 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <memory unit="KiB">131072</memory>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <currentMemory unit="KiB">131072</currentMemory>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <vcpu placement="static">1</vcpu>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <resource>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <partition>/machine</partition>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </resource>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <system>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <entry name="serial">d28dc50a-3cd1-4dc1-98f1-384bc695e924</entry>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <entry name="uuid">d28dc50a-3cd1-4dc1-98f1-384bc695e924</entry>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </system>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <os>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </os>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <features>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <vmcoreinfo state="on"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </features>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact" check="partial">
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <model fallback="allow">Nehalem</model>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <on_poweroff>destroy</on_poweroff>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <on_reboot>restart</on_reboot>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <on_crash>destroy</on_crash>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/disk.config"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <readonly/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="0" model="pcie-root"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="1" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="1" port="0x10"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="2" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="2" port="0x11"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="3" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="3" port="0x12"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="4" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="4" port="0x13"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="5" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="5" port="0x14"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="6" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="6" port="0x15"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="7" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="7" port="0x16"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="8" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="8" port="0x17"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="9" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="9" port="0x18"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="10" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="10" port="0x19"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="11" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="11" port="0x1a"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="12" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="12" port="0x1b"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="13" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="13" port="0x1c"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="14" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="14" port="0x1d"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="15" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="15" port="0x1e"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="16" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="16" port="0x1f"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="17" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="17" port="0x20"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="18" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="18" port="0x21"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="19" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="19" port="0x22"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="20" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="20" port="0x23"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="21" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="21" port="0x24"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="22" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="22" port="0x25"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="23" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="23" port="0x26"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="24" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="24" port="0x27"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="25" model="pcie-root-port">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-root-port"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target chassis="25" port="0x28"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model name="pcie-pci-bridge"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="usb" index="0" model="piix3-uhci">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <controller type="sata" index="0">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </controller>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <interface type="ethernet">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <mac address="fa:16:3e:44:b8:b3"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <mtu size="1442"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target dev="tap41dd1a70-5d"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <serial type="pty">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/console.log" append="off"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target type="isa-serial" port="0">
Feb 19 19:42:53 compute-0 nova_compute[186662]:         <model name="isa-serial"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       </target>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <console type="pty">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924/console.log" append="off"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <target type="serial" port="0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </console>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="usb" bus="0" port="1"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </input>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <input type="mouse" bus="ps2"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <listen type="address" address="::"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </graphics>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <video>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <model type="virtio" heads="1" primary="yes"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </video>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:42:53 compute-0 nova_compute[186662]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:42:53 compute-0 nova_compute[186662]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Feb 19 19:42:53 compute-0 nova_compute[186662]: </domain>
Feb 19 19:42:53 compute-0 nova_compute[186662]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Feb 19 19:42:53 compute-0 nova_compute[186662]: 2026-02-19 19:42:53.360 186666 DEBUG nova.virt.libvirt.driver [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Feb 19 19:42:53 compute-0 nova_compute[186662]: 2026-02-19 19:42:53.752 186666 DEBUG nova.virt.libvirt.migration [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Feb 19 19:42:53 compute-0 nova_compute[186662]: 2026-02-19 19:42:53.753 186666 INFO nova.virt.libvirt.migration [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 19 19:42:54 compute-0 podman[214923]: 2026-02-19 19:42:54.270984104 +0000 UTC m=+0.041529380 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:42:54 compute-0 nova_compute[186662]: 2026-02-19 19:42:54.797 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:54 compute-0 kernel: tap41dd1a70-5d (unregistering): left promiscuous mode
Feb 19 19:42:54 compute-0 NetworkManager[56519]: <info>  [1771530174.8722] device (tap41dd1a70-5d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:42:54 compute-0 ovn_controller[96653]: 2026-02-19T19:42:54Z|00166|binding|INFO|Releasing lport 41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 from this chassis (sb_readonly=0)
Feb 19 19:42:54 compute-0 nova_compute[186662]: 2026-02-19 19:42:54.874 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:54 compute-0 ovn_controller[96653]: 2026-02-19T19:42:54Z|00167|binding|INFO|Setting lport 41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 down in Southbound
Feb 19 19:42:54 compute-0 ovn_controller[96653]: 2026-02-19T19:42:54Z|00168|binding|INFO|Removing iface tap41dd1a70-5d ovn-installed in OVS
Feb 19 19:42:54 compute-0 nova_compute[186662]: 2026-02-19 19:42:54.883 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:54 compute-0 nova_compute[186662]: 2026-02-19 19:42:54.897 186666 INFO nova.virt.libvirt.driver [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 19 19:42:54 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000015.scope: Deactivated successfully.
Feb 19 19:42:54 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000015.scope: Consumed 13.505s CPU time.
Feb 19 19:42:54 compute-0 systemd-machined[156014]: Machine qemu-15-instance-00000015 terminated.
Feb 19 19:42:55 compute-0 nova_compute[186662]: 2026-02-19 19:42:55.106 186666 DEBUG nova.virt.libvirt.driver [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Feb 19 19:42:55 compute-0 nova_compute[186662]: 2026-02-19 19:42:55.108 186666 DEBUG nova.virt.libvirt.driver [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Feb 19 19:42:55 compute-0 nova_compute[186662]: 2026-02-19 19:42:55.108 186666 DEBUG nova.virt.libvirt.driver [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Feb 19 19:42:55 compute-0 ovn_controller[96653]: 2026-02-19T19:42:55Z|00169|binding|INFO|Releasing lport ff9c6a14-aac0-4622-8df4-fda6d4294002 from this chassis (sb_readonly=0)
Feb 19 19:42:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:55.123 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:b8:b3 10.100.0.8'], port_security=['fa:16:3e:44:b8:b3 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'd8481919-b10e-4218-b697-835a5c48ac63'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'd28dc50a-3cd1-4dc1-98f1-384bc695e924', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-68c034ae-92c4-4919-9175-8992eee924de', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5cd356f35cb9438f878370c6e951113c', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'f2b8d61b-76c5-4a92-99b9-2ca57ab4c309', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db5091ff-cd1b-42d9-b277-948d5e6e25d3, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=41dd1a70-5d3b-4180-b1cc-b01d51f10ae9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:42:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:55.124 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 in datapath 68c034ae-92c4-4919-9175-8992eee924de unbound from our chassis
Feb 19 19:42:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:55.124 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 68c034ae-92c4-4919-9175-8992eee924de, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:42:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:55.126 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[e688cf0a-7087-41da-8498-dc2aee7411e2]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:55.126 105986 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-68c034ae-92c4-4919-9175-8992eee924de namespace which is not needed anymore
Feb 19 19:42:55 compute-0 nova_compute[186662]: 2026-02-19 19:42:55.136 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:55 compute-0 neutron-haproxy-ovnmeta-68c034ae-92c4-4919-9175-8992eee924de[214768]: [NOTICE]   (214772) : haproxy version is 3.0.5-8e879a5
Feb 19 19:42:55 compute-0 neutron-haproxy-ovnmeta-68c034ae-92c4-4919-9175-8992eee924de[214768]: [NOTICE]   (214772) : path to executable is /usr/sbin/haproxy
Feb 19 19:42:55 compute-0 neutron-haproxy-ovnmeta-68c034ae-92c4-4919-9175-8992eee924de[214768]: [WARNING]  (214772) : Exiting Master process...
Feb 19 19:42:55 compute-0 podman[214990]: 2026-02-19 19:42:55.219661451 +0000 UTC m=+0.026767018 container kill 85d2fd84b6443954db64ffbdc8780410956657bddf9bf55e6906ce0bb5c777a1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-68c034ae-92c4-4919-9175-8992eee924de, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:42:55 compute-0 neutron-haproxy-ovnmeta-68c034ae-92c4-4919-9175-8992eee924de[214768]: [ALERT]    (214772) : Current worker (214774) exited with code 143 (Terminated)
Feb 19 19:42:55 compute-0 neutron-haproxy-ovnmeta-68c034ae-92c4-4919-9175-8992eee924de[214768]: [WARNING]  (214772) : All workers exited. Exiting... (0)
Feb 19 19:42:55 compute-0 systemd[1]: libpod-85d2fd84b6443954db64ffbdc8780410956657bddf9bf55e6906ce0bb5c777a1.scope: Deactivated successfully.
Feb 19 19:42:55 compute-0 podman[215005]: 2026-02-19 19:42:55.260292839 +0000 UTC m=+0.024405932 container died 85d2fd84b6443954db64ffbdc8780410956657bddf9bf55e6906ce0bb5c777a1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-68c034ae-92c4-4919-9175-8992eee924de, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 19 19:42:55 compute-0 nova_compute[186662]: 2026-02-19 19:42:55.266 186666 DEBUG nova.compute.manager [req-af1cf520-0b6d-4829-b4f2-2f596cee396b req-22b773bd-46b3-4932-8d9d-b5735adf4f61 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Received event network-vif-unplugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:42:55 compute-0 nova_compute[186662]: 2026-02-19 19:42:55.266 186666 DEBUG oslo_concurrency.lockutils [req-af1cf520-0b6d-4829-b4f2-2f596cee396b req-22b773bd-46b3-4932-8d9d-b5735adf4f61 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:42:55 compute-0 nova_compute[186662]: 2026-02-19 19:42:55.267 186666 DEBUG oslo_concurrency.lockutils [req-af1cf520-0b6d-4829-b4f2-2f596cee396b req-22b773bd-46b3-4932-8d9d-b5735adf4f61 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:42:55 compute-0 nova_compute[186662]: 2026-02-19 19:42:55.267 186666 DEBUG oslo_concurrency.lockutils [req-af1cf520-0b6d-4829-b4f2-2f596cee396b req-22b773bd-46b3-4932-8d9d-b5735adf4f61 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:42:55 compute-0 nova_compute[186662]: 2026-02-19 19:42:55.268 186666 DEBUG nova.compute.manager [req-af1cf520-0b6d-4829-b4f2-2f596cee396b req-22b773bd-46b3-4932-8d9d-b5735adf4f61 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] No waiting events found dispatching network-vif-unplugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:42:55 compute-0 nova_compute[186662]: 2026-02-19 19:42:55.268 186666 DEBUG nova.compute.manager [req-af1cf520-0b6d-4829-b4f2-2f596cee396b req-22b773bd-46b3-4932-8d9d-b5735adf4f61 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Received event network-vif-unplugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:42:55 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-85d2fd84b6443954db64ffbdc8780410956657bddf9bf55e6906ce0bb5c777a1-userdata-shm.mount: Deactivated successfully.
Feb 19 19:42:55 compute-0 systemd[1]: var-lib-containers-storage-overlay-affa1aa7c0c6fad315695633bf7ce7cf0a86aae5a0eae8afc79d1fd676a427bf-merged.mount: Deactivated successfully.
Feb 19 19:42:55 compute-0 podman[215005]: 2026-02-19 19:42:55.378404182 +0000 UTC m=+0.142517235 container cleanup 85d2fd84b6443954db64ffbdc8780410956657bddf9bf55e6906ce0bb5c777a1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-68c034ae-92c4-4919-9175-8992eee924de, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, org.label-schema.license=GPLv2)
Feb 19 19:42:55 compute-0 systemd[1]: libpod-conmon-85d2fd84b6443954db64ffbdc8780410956657bddf9bf55e6906ce0bb5c777a1.scope: Deactivated successfully.
Feb 19 19:42:55 compute-0 nova_compute[186662]: 2026-02-19 19:42:55.400 186666 DEBUG nova.virt.libvirt.guest [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'd28dc50a-3cd1-4dc1-98f1-384bc695e924' (instance-00000015) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Feb 19 19:42:55 compute-0 nova_compute[186662]: 2026-02-19 19:42:55.400 186666 INFO nova.virt.libvirt.driver [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Migration operation has completed
Feb 19 19:42:55 compute-0 nova_compute[186662]: 2026-02-19 19:42:55.401 186666 INFO nova.compute.manager [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] _post_live_migration() is started..
Feb 19 19:42:55 compute-0 nova_compute[186662]: 2026-02-19 19:42:55.437 186666 WARNING neutronclient.v2_0.client [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:42:55 compute-0 nova_compute[186662]: 2026-02-19 19:42:55.437 186666 WARNING neutronclient.v2_0.client [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:42:55 compute-0 podman[215021]: 2026-02-19 19:42:55.483480733 +0000 UTC m=+0.199948962 container remove 85d2fd84b6443954db64ffbdc8780410956657bddf9bf55e6906ce0bb5c777a1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-68c034ae-92c4-4919-9175-8992eee924de, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:42:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:55.487 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a6cdd00d-1032-4661-9ef9-4b3a9a96f3e3]: (4, ("Thu Feb 19 07:42:55 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-68c034ae-92c4-4919-9175-8992eee924de (85d2fd84b6443954db64ffbdc8780410956657bddf9bf55e6906ce0bb5c777a1)\n85d2fd84b6443954db64ffbdc8780410956657bddf9bf55e6906ce0bb5c777a1\nThu Feb 19 07:42:55 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-68c034ae-92c4-4919-9175-8992eee924de (85d2fd84b6443954db64ffbdc8780410956657bddf9bf55e6906ce0bb5c777a1)\n85d2fd84b6443954db64ffbdc8780410956657bddf9bf55e6906ce0bb5c777a1\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:55.488 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[cdff8adb-0dd8-4938-9d9e-e7199de93745]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:55.489 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/68c034ae-92c4-4919-9175-8992eee924de.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/68c034ae-92c4-4919-9175-8992eee924de.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:42:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:55.489 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[893a031b-cbbf-43cf-b6d1-417084f0406c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:55.490 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap68c034ae-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:42:55 compute-0 nova_compute[186662]: 2026-02-19 19:42:55.492 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:55 compute-0 kernel: tap68c034ae-90: left promiscuous mode
Feb 19 19:42:55 compute-0 nova_compute[186662]: 2026-02-19 19:42:55.505 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:55.509 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[e416b90b-debb-40da-a149-5daba28c7e99]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:55.530 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[0945aab0-b1ae-47b6-b19c-ae3816657505]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:55.530 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[e2bf3189-c406-41ff-a0fd-e1851dc9f127]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:55.541 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[5e3bb409-5983-4adf-8c26-0eaf969a4d7a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439330, 'reachable_time': 15962, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215039, 'error': None, 'target': 'ovnmeta-68c034ae-92c4-4919-9175-8992eee924de', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:55.543 106358 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-68c034ae-92c4-4919-9175-8992eee924de deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Feb 19 19:42:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:55.543 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[735a780e-5769-4e34-922c-5b8fc52d556e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:42:55 compute-0 systemd[1]: run-netns-ovnmeta\x2d68c034ae\x2d92c4\x2d4919\x2d9175\x2d8992eee924de.mount: Deactivated successfully.
Feb 19 19:42:56 compute-0 nova_compute[186662]: 2026-02-19 19:42:56.052 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.216 186666 DEBUG nova.network.neutron [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Activated binding for port 41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.217 186666 DEBUG nova.compute.manager [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "address": "fa:16:3e:44:b8:b3", "network": {"id": "68c034ae-92c4-4919-9175-8992eee924de", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-358018379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72833255ccf64eb28a4dd1a5d46b8625", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41dd1a70-5d", "ovs_interfaceid": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.217 186666 DEBUG nova.virt.libvirt.vif [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:41:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-755197348',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-755197348',id=21,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:42:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5cd356f35cb9438f878370c6e951113c',ramdisk_id='',reservation_id='r-1bjy2ija',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_v
if_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-695507613',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-695507613-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:42:31Z,user_data=None,user_id='96e4d16f3a064f518ae335f4a1dc7345',uuid=d28dc50a-3cd1-4dc1-98f1-384bc695e924,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "address": "fa:16:3e:44:b8:b3", "network": {"id": "68c034ae-92c4-4919-9175-8992eee924de", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-358018379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72833255ccf64eb28a4dd1a5d46b8625", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41dd1a70-5d", "ovs_interfaceid": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.217 186666 DEBUG nova.network.os_vif_util [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "address": "fa:16:3e:44:b8:b3", "network": {"id": "68c034ae-92c4-4919-9175-8992eee924de", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-358018379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72833255ccf64eb28a4dd1a5d46b8625", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41dd1a70-5d", "ovs_interfaceid": "41dd1a70-5d3b-4180-b1cc-b01d51f10ae9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.218 186666 DEBUG nova.network.os_vif_util [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:b8:b3,bridge_name='br-int',has_traffic_filtering=True,id=41dd1a70-5d3b-4180-b1cc-b01d51f10ae9,network=Network(68c034ae-92c4-4919-9175-8992eee924de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41dd1a70-5d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.218 186666 DEBUG os_vif [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:b8:b3,bridge_name='br-int',has_traffic_filtering=True,id=41dd1a70-5d3b-4180-b1cc-b01d51f10ae9,network=Network(68c034ae-92c4-4919-9175-8992eee924de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41dd1a70-5d') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.220 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.220 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41dd1a70-5d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.221 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.222 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.223 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.223 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=a7a4ccc8-8c96-4bec-9198-090391c80143) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.223 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.224 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.226 186666 INFO os_vif [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:b8:b3,bridge_name='br-int',has_traffic_filtering=True,id=41dd1a70-5d3b-4180-b1cc-b01d51f10ae9,network=Network(68c034ae-92c4-4919-9175-8992eee924de),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41dd1a70-5d')
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.226 186666 DEBUG oslo_concurrency.lockutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.227 186666 DEBUG oslo_concurrency.lockutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.227 186666 DEBUG oslo_concurrency.lockutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.227 186666 DEBUG nova.compute.manager [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.227 186666 INFO nova.virt.libvirt.driver [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Deleting instance files /var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924_del
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.228 186666 INFO nova.virt.libvirt.driver [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Deletion of /var/lib/nova/instances/d28dc50a-3cd1-4dc1-98f1-384bc695e924_del complete
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.337 186666 DEBUG nova.compute.manager [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Received event network-vif-plugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.337 186666 DEBUG oslo_concurrency.lockutils [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.337 186666 DEBUG oslo_concurrency.lockutils [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.337 186666 DEBUG oslo_concurrency.lockutils [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.337 186666 DEBUG nova.compute.manager [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] No waiting events found dispatching network-vif-plugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.337 186666 WARNING nova.compute.manager [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Received unexpected event network-vif-plugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 for instance with vm_state active and task_state migrating.
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.338 186666 DEBUG nova.compute.manager [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Received event network-vif-unplugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.338 186666 DEBUG oslo_concurrency.lockutils [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.338 186666 DEBUG oslo_concurrency.lockutils [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.338 186666 DEBUG oslo_concurrency.lockutils [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.338 186666 DEBUG nova.compute.manager [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] No waiting events found dispatching network-vif-unplugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.339 186666 DEBUG nova.compute.manager [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Received event network-vif-unplugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.339 186666 DEBUG nova.compute.manager [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Received event network-vif-unplugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.339 186666 DEBUG oslo_concurrency.lockutils [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.339 186666 DEBUG oslo_concurrency.lockutils [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.339 186666 DEBUG oslo_concurrency.lockutils [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.339 186666 DEBUG nova.compute.manager [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] No waiting events found dispatching network-vif-unplugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.340 186666 DEBUG nova.compute.manager [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Received event network-vif-unplugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.340 186666 DEBUG nova.compute.manager [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Received event network-vif-plugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.340 186666 DEBUG oslo_concurrency.lockutils [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.340 186666 DEBUG oslo_concurrency.lockutils [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.340 186666 DEBUG oslo_concurrency.lockutils [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.341 186666 DEBUG nova.compute.manager [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] No waiting events found dispatching network-vif-plugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:42:57 compute-0 nova_compute[186662]: 2026-02-19 19:42:57.341 186666 WARNING nova.compute.manager [req-59b79564-725c-4e01-a849-a40e205d0ef1 req-11a23451-5d0b-45a7-b832-2a2162aa80cf 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Received unexpected event network-vif-plugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 for instance with vm_state active and task_state migrating.
Feb 19 19:42:58 compute-0 podman[215040]: 2026-02-19 19:42:58.313420653 +0000 UTC m=+0.071224096 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, architecture=x86_64, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 19 19:42:58 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:42:58.963 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:42:59 compute-0 nova_compute[186662]: 2026-02-19 19:42:59.446 186666 DEBUG nova.compute.manager [req-ecf303f2-c56e-4771-972a-cb4fefc24540 req-41c89220-ac87-4a0f-abf2-30696edb7d59 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Received event network-vif-plugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:42:59 compute-0 nova_compute[186662]: 2026-02-19 19:42:59.447 186666 DEBUG oslo_concurrency.lockutils [req-ecf303f2-c56e-4771-972a-cb4fefc24540 req-41c89220-ac87-4a0f-abf2-30696edb7d59 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:42:59 compute-0 nova_compute[186662]: 2026-02-19 19:42:59.447 186666 DEBUG oslo_concurrency.lockutils [req-ecf303f2-c56e-4771-972a-cb4fefc24540 req-41c89220-ac87-4a0f-abf2-30696edb7d59 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:42:59 compute-0 nova_compute[186662]: 2026-02-19 19:42:59.448 186666 DEBUG oslo_concurrency.lockutils [req-ecf303f2-c56e-4771-972a-cb4fefc24540 req-41c89220-ac87-4a0f-abf2-30696edb7d59 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:42:59 compute-0 nova_compute[186662]: 2026-02-19 19:42:59.448 186666 DEBUG nova.compute.manager [req-ecf303f2-c56e-4771-972a-cb4fefc24540 req-41c89220-ac87-4a0f-abf2-30696edb7d59 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] No waiting events found dispatching network-vif-plugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:42:59 compute-0 nova_compute[186662]: 2026-02-19 19:42:59.448 186666 WARNING nova.compute.manager [req-ecf303f2-c56e-4771-972a-cb4fefc24540 req-41c89220-ac87-4a0f-abf2-30696edb7d59 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Received unexpected event network-vif-plugged-41dd1a70-5d3b-4180-b1cc-b01d51f10ae9 for instance with vm_state active and task_state migrating.
Feb 19 19:42:59 compute-0 sshd-session[215055]: Invalid user oracle from 103.67.78.251 port 45688
Feb 19 19:42:59 compute-0 podman[196025]: time="2026-02-19T19:42:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:42:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:42:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:42:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:42:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2197 "" "Go-http-client/1.1"
Feb 19 19:42:59 compute-0 nova_compute[186662]: 2026-02-19 19:42:59.799 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:42:59 compute-0 sshd-session[215055]: Received disconnect from 103.67.78.251 port 45688:11: Bye Bye [preauth]
Feb 19 19:42:59 compute-0 sshd-session[215055]: Disconnected from invalid user oracle 103.67.78.251 port 45688 [preauth]
Feb 19 19:43:01 compute-0 podman[215063]: 2026-02-19 19:43:01.283397826 +0000 UTC m=+0.064646740 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.43.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 19 19:43:01 compute-0 openstack_network_exporter[198916]: ERROR   19:43:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:43:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:43:01 compute-0 openstack_network_exporter[198916]: ERROR   19:43:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:43:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:43:02 compute-0 nova_compute[186662]: 2026-02-19 19:43:02.224 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:04 compute-0 nova_compute[186662]: 2026-02-19 19:43:04.800 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:05 compute-0 podman[215089]: 2026-02-19 19:43:05.25972315 +0000 UTC m=+0.041619462 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 19 19:43:06 compute-0 nova_compute[186662]: 2026-02-19 19:43:06.437 186666 DEBUG oslo_concurrency.lockutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:43:06 compute-0 nova_compute[186662]: 2026-02-19 19:43:06.438 186666 DEBUG oslo_concurrency.lockutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:43:06 compute-0 nova_compute[186662]: 2026-02-19 19:43:06.438 186666 DEBUG oslo_concurrency.lockutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "d28dc50a-3cd1-4dc1-98f1-384bc695e924-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:43:06 compute-0 nova_compute[186662]: 2026-02-19 19:43:06.950 186666 DEBUG oslo_concurrency.lockutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:43:06 compute-0 nova_compute[186662]: 2026-02-19 19:43:06.950 186666 DEBUG oslo_concurrency.lockutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:43:06 compute-0 nova_compute[186662]: 2026-02-19 19:43:06.950 186666 DEBUG oslo_concurrency.lockutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:43:06 compute-0 nova_compute[186662]: 2026-02-19 19:43:06.950 186666 DEBUG nova.compute.resource_tracker [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:43:07 compute-0 nova_compute[186662]: 2026-02-19 19:43:07.074 186666 WARNING nova.virt.libvirt.driver [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:43:07 compute-0 nova_compute[186662]: 2026-02-19 19:43:07.075 186666 DEBUG oslo_concurrency.processutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:43:07 compute-0 nova_compute[186662]: 2026-02-19 19:43:07.089 186666 DEBUG oslo_concurrency.processutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:43:07 compute-0 nova_compute[186662]: 2026-02-19 19:43:07.090 186666 DEBUG nova.compute.resource_tracker [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5833MB free_disk=72.97611999511719GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:43:07 compute-0 nova_compute[186662]: 2026-02-19 19:43:07.090 186666 DEBUG oslo_concurrency.lockutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:43:07 compute-0 nova_compute[186662]: 2026-02-19 19:43:07.090 186666 DEBUG oslo_concurrency.lockutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:43:07 compute-0 nova_compute[186662]: 2026-02-19 19:43:07.227 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:08 compute-0 nova_compute[186662]: 2026-02-19 19:43:08.106 186666 DEBUG nova.compute.resource_tracker [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Migration for instance d28dc50a-3cd1-4dc1-98f1-384bc695e924 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Feb 19 19:43:08 compute-0 nova_compute[186662]: 2026-02-19 19:43:08.613 186666 DEBUG nova.compute.resource_tracker [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Feb 19 19:43:08 compute-0 nova_compute[186662]: 2026-02-19 19:43:08.634 186666 DEBUG nova.compute.resource_tracker [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Migration a6ff9943-ba78-421e-8dfb-404f3e720229 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Feb 19 19:43:08 compute-0 nova_compute[186662]: 2026-02-19 19:43:08.635 186666 DEBUG nova.compute.resource_tracker [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:43:08 compute-0 nova_compute[186662]: 2026-02-19 19:43:08.635 186666 DEBUG nova.compute.resource_tracker [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:43:07 up  1:14,  0 user,  load average: 0.28, 0.29, 0.32\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:43:08 compute-0 nova_compute[186662]: 2026-02-19 19:43:08.671 186666 DEBUG nova.compute.provider_tree [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:43:09 compute-0 nova_compute[186662]: 2026-02-19 19:43:09.179 186666 DEBUG nova.scheduler.client.report [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:43:09 compute-0 nova_compute[186662]: 2026-02-19 19:43:09.690 186666 DEBUG nova.compute.resource_tracker [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:43:09 compute-0 nova_compute[186662]: 2026-02-19 19:43:09.690 186666 DEBUG oslo_concurrency.lockutils [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.600s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:43:09 compute-0 nova_compute[186662]: 2026-02-19 19:43:09.704 186666 INFO nova.compute.manager [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 19 19:43:09 compute-0 nova_compute[186662]: 2026-02-19 19:43:09.802 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:10 compute-0 nova_compute[186662]: 2026-02-19 19:43:10.757 186666 INFO nova.scheduler.client.report [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Deleted allocation for migration a6ff9943-ba78-421e-8dfb-404f3e720229
Feb 19 19:43:10 compute-0 nova_compute[186662]: 2026-02-19 19:43:10.757 186666 DEBUG nova.virt.libvirt.driver [None req-c21a7293-a3e6-43e3-900e-5c038c01a1fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: d28dc50a-3cd1-4dc1-98f1-384bc695e924] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Feb 19 19:43:12 compute-0 nova_compute[186662]: 2026-02-19 19:43:12.230 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:14 compute-0 nova_compute[186662]: 2026-02-19 19:43:14.804 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:17 compute-0 nova_compute[186662]: 2026-02-19 19:43:17.233 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:19 compute-0 nova_compute[186662]: 2026-02-19 19:43:19.806 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:21 compute-0 nova_compute[186662]: 2026-02-19 19:43:21.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:43:22 compute-0 nova_compute[186662]: 2026-02-19 19:43:22.235 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:22 compute-0 nova_compute[186662]: 2026-02-19 19:43:22.452 186666 DEBUG nova.compute.manager [None req-d0b07985-4da8-4a45-ad4f-9f940a6b29eb b61936f8600641abb9e2d5787407b4b1 084bf37190834c4d9a8f0459d9d05ec7 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:632
Feb 19 19:43:22 compute-0 nova_compute[186662]: 2026-02-19 19:43:22.496 186666 DEBUG nova.compute.provider_tree [None req-d0b07985-4da8-4a45-ad4f-9f940a6b29eb b61936f8600641abb9e2d5787407b4b1 084bf37190834c4d9a8f0459d9d05ec7 - - default default] Updating resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 generation from 31 to 34 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Feb 19 19:43:23 compute-0 nova_compute[186662]: 2026-02-19 19:43:23.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:43:24 compute-0 nova_compute[186662]: 2026-02-19 19:43:24.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:43:24 compute-0 nova_compute[186662]: 2026-02-19 19:43:24.806 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:25 compute-0 podman[215116]: 2026-02-19 19:43:25.29924742 +0000 UTC m=+0.079453826 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 19 19:43:25 compute-0 nova_compute[186662]: 2026-02-19 19:43:25.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:43:26 compute-0 nova_compute[186662]: 2026-02-19 19:43:26.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:43:26 compute-0 nova_compute[186662]: 2026-02-19 19:43:26.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:43:27 compute-0 nova_compute[186662]: 2026-02-19 19:43:27.237 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:28 compute-0 nova_compute[186662]: 2026-02-19 19:43:28.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:43:29 compute-0 podman[215135]: 2026-02-19 19:43:29.272433849 +0000 UTC m=+0.052857871 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public, container_name=openstack_network_exporter, version=9.7, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 
'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, vcs-type=git, architecture=x86_64)
Feb 19 19:43:29 compute-0 nova_compute[186662]: 2026-02-19 19:43:29.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:43:29 compute-0 podman[196025]: time="2026-02-19T19:43:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:43:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:43:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:43:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:43:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2191 "" "Go-http-client/1.1"
Feb 19 19:43:29 compute-0 nova_compute[186662]: 2026-02-19 19:43:29.807 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:30 compute-0 nova_compute[186662]: 2026-02-19 19:43:30.086 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:43:30 compute-0 nova_compute[186662]: 2026-02-19 19:43:30.086 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:43:30 compute-0 nova_compute[186662]: 2026-02-19 19:43:30.087 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:43:30 compute-0 nova_compute[186662]: 2026-02-19 19:43:30.087 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:43:30 compute-0 nova_compute[186662]: 2026-02-19 19:43:30.196 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:43:30 compute-0 nova_compute[186662]: 2026-02-19 19:43:30.197 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:43:30 compute-0 nova_compute[186662]: 2026-02-19 19:43:30.208 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.011s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:43:30 compute-0 nova_compute[186662]: 2026-02-19 19:43:30.208 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5849MB free_disk=72.97610092163086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:43:30 compute-0 nova_compute[186662]: 2026-02-19 19:43:30.209 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:43:30 compute-0 nova_compute[186662]: 2026-02-19 19:43:30.209 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:43:31 compute-0 nova_compute[186662]: 2026-02-19 19:43:31.391 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:43:31 compute-0 nova_compute[186662]: 2026-02-19 19:43:31.392 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:43:30 up  1:14,  0 user,  load average: 0.20, 0.27, 0.32\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:43:31 compute-0 nova_compute[186662]: 2026-02-19 19:43:31.419 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:43:31 compute-0 openstack_network_exporter[198916]: ERROR   19:43:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:43:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:43:31 compute-0 openstack_network_exporter[198916]: ERROR   19:43:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:43:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:43:31 compute-0 nova_compute[186662]: 2026-02-19 19:43:31.926 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:43:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:43:32.147 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:43:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:43:32.147 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:43:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:43:32.147 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:43:32 compute-0 nova_compute[186662]: 2026-02-19 19:43:32.239 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:32 compute-0 podman[215160]: 2026-02-19 19:43:32.327901989 +0000 UTC m=+0.101729156 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, 
container_name=ovn_controller, org.label-schema.build-date=20260216)
Feb 19 19:43:32 compute-0 nova_compute[186662]: 2026-02-19 19:43:32.435 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:43:32 compute-0 nova_compute[186662]: 2026-02-19 19:43:32.435 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.226s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:43:34 compute-0 nova_compute[186662]: 2026-02-19 19:43:34.430 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:43:34 compute-0 nova_compute[186662]: 2026-02-19 19:43:34.808 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:34 compute-0 nova_compute[186662]: 2026-02-19 19:43:34.938 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:43:36 compute-0 podman[215184]: 2026-02-19 19:43:36.252569793 +0000 UTC m=+0.034675811 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 19 19:43:37 compute-0 nova_compute[186662]: 2026-02-19 19:43:37.242 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:39 compute-0 nova_compute[186662]: 2026-02-19 19:43:39.811 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:43:40.038 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:91:5e 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c593f4fa-8caf-4204-a168-7d36dea7afd9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '494c6c3df70542bd8ad63a3ad2241fe8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a41cc354-2c45-4738-99b1-e4951b7f67ae, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=35fb9fb2-2f23-4514-91e1-a3a8a8349d32) old=Port_Binding(mac=['fa:16:3e:f5:91:5e'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c593f4fa-8caf-4204-a168-7d36dea7afd9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '494c6c3df70542bd8ad63a3ad2241fe8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:43:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:43:40.039 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 35fb9fb2-2f23-4514-91e1-a3a8a8349d32 in datapath c593f4fa-8caf-4204-a168-7d36dea7afd9 updated
Feb 19 19:43:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:43:40.039 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c593f4fa-8caf-4204-a168-7d36dea7afd9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:43:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:43:40.040 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[43db3642-01bf-4fe8-9b92-966589c30f8d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:43:42 compute-0 nova_compute[186662]: 2026-02-19 19:43:42.244 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:44 compute-0 nova_compute[186662]: 2026-02-19 19:43:44.868 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:47 compute-0 nova_compute[186662]: 2026-02-19 19:43:47.247 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:49 compute-0 nova_compute[186662]: 2026-02-19 19:43:49.869 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:43:50.239 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:f5:33 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-6db303e4-b2fe-49af-9ab4-7a6a3c850a68', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db303e4-b2fe-49af-9ab4-7a6a3c850a68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e7e566fe9744481b8016f2a804a68c2b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46cdf4b9-af7d-4bcb-87e9-f5b75d9569d3, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef6940a9-466f-4d5a-8ace-487e169bbe42) old=Port_Binding(mac=['fa:16:3e:b8:f5:33'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-6db303e4-b2fe-49af-9ab4-7a6a3c850a68', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db303e4-b2fe-49af-9ab4-7a6a3c850a68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e7e566fe9744481b8016f2a804a68c2b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:43:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:43:50.240 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef6940a9-466f-4d5a-8ace-487e169bbe42 in datapath 6db303e4-b2fe-49af-9ab4-7a6a3c850a68 updated
Feb 19 19:43:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:43:50.240 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db303e4-b2fe-49af-9ab4-7a6a3c850a68, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:43:50 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:43:50.241 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a43863e5-7871-4eb8-a727-dca72da84425]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:43:51 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:43:51.100 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:43:51 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:43:51.100 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:43:51 compute-0 nova_compute[186662]: 2026-02-19 19:43:51.102 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:52 compute-0 nova_compute[186662]: 2026-02-19 19:43:52.250 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:54 compute-0 nova_compute[186662]: 2026-02-19 19:43:54.872 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:56 compute-0 podman[215210]: 2026-02-19 19:43:56.267000192 +0000 UTC m=+0.043843152 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 19 19:43:57 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:43:57.102 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:43:57 compute-0 nova_compute[186662]: 2026-02-19 19:43:57.256 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:43:58 compute-0 ovn_controller[96653]: 2026-02-19T19:43:58Z|00170|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 19 19:43:59 compute-0 podman[196025]: time="2026-02-19T19:43:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:43:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:43:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:43:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:43:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2193 "" "Go-http-client/1.1"
Feb 19 19:43:59 compute-0 nova_compute[186662]: 2026-02-19 19:43:59.878 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:00 compute-0 podman[215229]: 2026-02-19 19:44:00.262716608 +0000 UTC m=+0.040055741 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, architecture=x86_64, version=9.7, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z)
Feb 19 19:44:01 compute-0 openstack_network_exporter[198916]: ERROR   19:44:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:44:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:44:01 compute-0 openstack_network_exporter[198916]: ERROR   19:44:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:44:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:44:02 compute-0 nova_compute[186662]: 2026-02-19 19:44:02.263 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:03 compute-0 podman[215250]: 2026-02-19 19:44:03.342084677 +0000 UTC m=+0.117072746 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 19 19:44:04 compute-0 nova_compute[186662]: 2026-02-19 19:44:04.881 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:07 compute-0 nova_compute[186662]: 2026-02-19 19:44:07.266 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:07 compute-0 podman[215276]: 2026-02-19 19:44:07.274563341 +0000 UTC m=+0.053137368 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:44:09 compute-0 nova_compute[186662]: 2026-02-19 19:44:09.882 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:12 compute-0 nova_compute[186662]: 2026-02-19 19:44:12.269 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:14 compute-0 nova_compute[186662]: 2026-02-19 19:44:14.884 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:17 compute-0 sshd-session[215302]: Invalid user dixi from 96.78.175.42 port 43874
Feb 19 19:44:17 compute-0 sshd-session[215302]: Received disconnect from 96.78.175.42 port 43874:11: Bye Bye [preauth]
Feb 19 19:44:17 compute-0 sshd-session[215302]: Disconnected from invalid user dixi 96.78.175.42 port 43874 [preauth]
Feb 19 19:44:17 compute-0 nova_compute[186662]: 2026-02-19 19:44:17.272 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:19 compute-0 nova_compute[186662]: 2026-02-19 19:44:19.887 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:21 compute-0 nova_compute[186662]: 2026-02-19 19:44:21.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:44:22 compute-0 nova_compute[186662]: 2026-02-19 19:44:22.276 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:23 compute-0 nova_compute[186662]: 2026-02-19 19:44:23.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:44:24 compute-0 nova_compute[186662]: 2026-02-19 19:44:24.888 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:26 compute-0 nova_compute[186662]: 2026-02-19 19:44:26.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:44:27 compute-0 nova_compute[186662]: 2026-02-19 19:44:27.279 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:27 compute-0 podman[215304]: 2026-02-19 19:44:27.304286022 +0000 UTC m=+0.063255624 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS)
Feb 19 19:44:27 compute-0 nova_compute[186662]: 2026-02-19 19:44:27.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:44:28 compute-0 nova_compute[186662]: 2026-02-19 19:44:28.578 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:44:28 compute-0 nova_compute[186662]: 2026-02-19 19:44:28.579 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:44:29 compute-0 nova_compute[186662]: 2026-02-19 19:44:29.582 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:44:29 compute-0 podman[196025]: time="2026-02-19T19:44:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:44:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:44:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:44:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:44:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2198 "" "Go-http-client/1.1"
Feb 19 19:44:29 compute-0 nova_compute[186662]: 2026-02-19 19:44:29.932 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:30 compute-0 nova_compute[186662]: 2026-02-19 19:44:30.095 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:44:30 compute-0 nova_compute[186662]: 2026-02-19 19:44:30.096 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:44:30 compute-0 nova_compute[186662]: 2026-02-19 19:44:30.096 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:44:30 compute-0 nova_compute[186662]: 2026-02-19 19:44:30.096 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:44:30 compute-0 nova_compute[186662]: 2026-02-19 19:44:30.241 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:44:30 compute-0 nova_compute[186662]: 2026-02-19 19:44:30.242 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:44:30 compute-0 nova_compute[186662]: 2026-02-19 19:44:30.260 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:44:30 compute-0 nova_compute[186662]: 2026-02-19 19:44:30.261 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5840MB free_disk=72.97555160522461GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:44:30 compute-0 nova_compute[186662]: 2026-02-19 19:44:30.261 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:44:30 compute-0 nova_compute[186662]: 2026-02-19 19:44:30.261 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:44:31 compute-0 nova_compute[186662]: 2026-02-19 19:44:31.305 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:44:31 compute-0 nova_compute[186662]: 2026-02-19 19:44:31.306 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:44:30 up  1:15,  0 user,  load average: 0.07, 0.22, 0.29\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:44:31 compute-0 podman[215324]: 2026-02-19 19:44:31.311416035 +0000 UTC m=+0.074991808 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-type=git, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf 
as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.7, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container)
Feb 19 19:44:31 compute-0 nova_compute[186662]: 2026-02-19 19:44:31.328 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing inventories for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Feb 19 19:44:31 compute-0 nova_compute[186662]: 2026-02-19 19:44:31.349 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating ProviderTree inventory for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Feb 19 19:44:31 compute-0 nova_compute[186662]: 2026-02-19 19:44:31.349 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating inventory in ProviderTree for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Feb 19 19:44:31 compute-0 nova_compute[186662]: 2026-02-19 19:44:31.368 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing aggregate associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Feb 19 19:44:31 compute-0 nova_compute[186662]: 2026-02-19 19:44:31.386 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing trait associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_SSE2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,HW_ARCH_X86_64,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_USB _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Feb 19 19:44:31 compute-0 nova_compute[186662]: 2026-02-19 19:44:31.414 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:44:31 compute-0 openstack_network_exporter[198916]: ERROR   19:44:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:44:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:44:31 compute-0 openstack_network_exporter[198916]: ERROR   19:44:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:44:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:44:31 compute-0 nova_compute[186662]: 2026-02-19 19:44:31.924 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:44:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:44:32.148 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:44:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:44:32.149 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:44:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:44:32.149 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:44:32 compute-0 nova_compute[186662]: 2026-02-19 19:44:32.282 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:32 compute-0 nova_compute[186662]: 2026-02-19 19:44:32.433 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:44:32 compute-0 nova_compute[186662]: 2026-02-19 19:44:32.433 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.172s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:44:33 compute-0 nova_compute[186662]: 2026-02-19 19:44:33.427 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:44:33 compute-0 nova_compute[186662]: 2026-02-19 19:44:33.428 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:44:33 compute-0 sshd-session[215346]: Invalid user mysql from 106.51.64.128 port 50916
Feb 19 19:44:33 compute-0 podman[215348]: 2026-02-19 19:44:33.646222469 +0000 UTC m=+0.084360074 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 19 19:44:33 compute-0 sshd-session[215346]: Received disconnect from 106.51.64.128 port 50916:11: Bye Bye [preauth]
Feb 19 19:44:33 compute-0 sshd-session[215346]: Disconnected from invalid user mysql 106.51.64.128 port 50916 [preauth]
Feb 19 19:44:34 compute-0 nova_compute[186662]: 2026-02-19 19:44:34.935 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:37 compute-0 nova_compute[186662]: 2026-02-19 19:44:37.287 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:38 compute-0 podman[215374]: 2026-02-19 19:44:38.30226348 +0000 UTC m=+0.066164284 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 19 19:44:39 compute-0 nova_compute[186662]: 2026-02-19 19:44:39.938 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:42 compute-0 nova_compute[186662]: 2026-02-19 19:44:42.291 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:44 compute-0 nova_compute[186662]: 2026-02-19 19:44:44.940 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:44 compute-0 sshd-session[215398]: Invalid user manager from 45.169.200.254 port 40194
Feb 19 19:44:45 compute-0 sshd-session[215398]: Received disconnect from 45.169.200.254 port 40194:11: Bye Bye [preauth]
Feb 19 19:44:45 compute-0 sshd-session[215398]: Disconnected from invalid user manager 45.169.200.254 port 40194 [preauth]
Feb 19 19:44:47 compute-0 nova_compute[186662]: 2026-02-19 19:44:47.294 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:49 compute-0 nova_compute[186662]: 2026-02-19 19:44:49.946 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:51 compute-0 sshd-session[215400]: Invalid user nutanix from 197.211.55.20 port 46166
Feb 19 19:44:51 compute-0 sshd-session[215400]: Received disconnect from 197.211.55.20 port 46166:11: Bye Bye [preauth]
Feb 19 19:44:51 compute-0 sshd-session[215400]: Disconnected from invalid user nutanix 197.211.55.20 port 46166 [preauth]
Feb 19 19:44:52 compute-0 nova_compute[186662]: 2026-02-19 19:44:52.298 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:52 compute-0 nova_compute[186662]: 2026-02-19 19:44:52.706 186666 DEBUG nova.virt.libvirt.driver [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Creating tmpfile /var/lib/nova/instances/tmp2dd7idmo to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Feb 19 19:44:52 compute-0 nova_compute[186662]: 2026-02-19 19:44:52.707 186666 WARNING neutronclient.v2_0.client [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:44:52 compute-0 nova_compute[186662]: 2026-02-19 19:44:52.896 186666 DEBUG nova.compute.manager [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2dd7idmo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Feb 19 19:44:55 compute-0 nova_compute[186662]: 2026-02-19 19:44:55.674 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:55 compute-0 nova_compute[186662]: 2026-02-19 19:44:55.677 186666 WARNING neutronclient.v2_0.client [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:44:57 compute-0 nova_compute[186662]: 2026-02-19 19:44:57.301 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:44:58 compute-0 podman[215402]: 2026-02-19 19:44:58.294548852 +0000 UTC m=+0.062587467 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS)
Feb 19 19:44:59 compute-0 podman[196025]: time="2026-02-19T19:44:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:44:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:44:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:44:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:44:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2199 "" "Go-http-client/1.1"
Feb 19 19:44:59 compute-0 nova_compute[186662]: 2026-02-19 19:44:59.928 186666 DEBUG nova.compute.manager [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2dd7idmo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='98af52ad-2964-4557-85f1-eb0343d8f085',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Feb 19 19:45:00 compute-0 nova_compute[186662]: 2026-02-19 19:45:00.679 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:00 compute-0 nova_compute[186662]: 2026-02-19 19:45:00.943 186666 DEBUG oslo_concurrency.lockutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-98af52ad-2964-4557-85f1-eb0343d8f085" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:45:00 compute-0 nova_compute[186662]: 2026-02-19 19:45:00.943 186666 DEBUG oslo_concurrency.lockutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-98af52ad-2964-4557-85f1-eb0343d8f085" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:45:00 compute-0 nova_compute[186662]: 2026-02-19 19:45:00.943 186666 DEBUG nova.network.neutron [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:45:01 compute-0 openstack_network_exporter[198916]: ERROR   19:45:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:45:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:45:01 compute-0 openstack_network_exporter[198916]: ERROR   19:45:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:45:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:45:01 compute-0 nova_compute[186662]: 2026-02-19 19:45:01.452 186666 WARNING neutronclient.v2_0.client [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:45:01 compute-0 nova_compute[186662]: 2026-02-19 19:45:01.828 186666 WARNING neutronclient.v2_0.client [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:45:02 compute-0 nova_compute[186662]: 2026-02-19 19:45:02.072 186666 DEBUG nova.network.neutron [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Updating instance_info_cache with network_info: [{"id": "4adba740-03c5-45a4-8b6f-3cffa5076199", "address": "fa:16:3e:d2:5a:0c", "network": {"id": "c593f4fa-8caf-4204-a168-7d36dea7afd9", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-530295078-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "494c6c3df70542bd8ad63a3ad2241fe8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4adba740-03", "ovs_interfaceid": "4adba740-03c5-45a4-8b6f-3cffa5076199", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:45:02 compute-0 podman[215421]: 2026-02-19 19:45:02.292736298 +0000 UTC m=+0.073661026 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, name=ubi9/ubi-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Feb 19 19:45:02 compute-0 nova_compute[186662]: 2026-02-19 19:45:02.303 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:02 compute-0 nova_compute[186662]: 2026-02-19 19:45:02.577 186666 DEBUG oslo_concurrency.lockutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-98af52ad-2964-4557-85f1-eb0343d8f085" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:45:02 compute-0 nova_compute[186662]: 2026-02-19 19:45:02.591 186666 DEBUG nova.virt.libvirt.driver [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2dd7idmo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='98af52ad-2964-4557-85f1-eb0343d8f085',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Feb 19 19:45:02 compute-0 nova_compute[186662]: 2026-02-19 19:45:02.592 186666 DEBUG nova.virt.libvirt.driver [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Creating instance directory: /var/lib/nova/instances/98af52ad-2964-4557-85f1-eb0343d8f085 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Feb 19 19:45:02 compute-0 nova_compute[186662]: 2026-02-19 19:45:02.592 186666 DEBUG nova.virt.libvirt.driver [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Creating disk.info with the contents: {'/var/lib/nova/instances/98af52ad-2964-4557-85f1-eb0343d8f085/disk': 'qcow2', '/var/lib/nova/instances/98af52ad-2964-4557-85f1-eb0343d8f085/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Feb 19 19:45:02 compute-0 nova_compute[186662]: 2026-02-19 19:45:02.592 186666 DEBUG nova.virt.libvirt.driver [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Feb 19 19:45:02 compute-0 nova_compute[186662]: 2026-02-19 19:45:02.593 186666 DEBUG nova.objects.instance [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 98af52ad-2964-4557-85f1-eb0343d8f085 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.100 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.103 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.104 186666 DEBUG oslo_concurrency.processutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.166 186666 DEBUG oslo_concurrency.processutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.166 186666 DEBUG oslo_concurrency.lockutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.167 186666 DEBUG oslo_concurrency.lockutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.167 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.171 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.171 186666 DEBUG oslo_concurrency.processutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.212 186666 DEBUG oslo_concurrency.processutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.213 186666 DEBUG oslo_concurrency.processutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/98af52ad-2964-4557-85f1-eb0343d8f085/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.237 186666 DEBUG oslo_concurrency.processutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/98af52ad-2964-4557-85f1-eb0343d8f085/disk 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.238 186666 DEBUG oslo_concurrency.lockutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.071s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.238 186666 DEBUG oslo_concurrency.processutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.292 186666 DEBUG oslo_concurrency.processutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.293 186666 DEBUG nova.virt.disk.api [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Checking if we can resize image /var/lib/nova/instances/98af52ad-2964-4557-85f1-eb0343d8f085/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.293 186666 DEBUG oslo_concurrency.processutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/98af52ad-2964-4557-85f1-eb0343d8f085/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.357 186666 DEBUG oslo_concurrency.processutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/98af52ad-2964-4557-85f1-eb0343d8f085/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.358 186666 DEBUG nova.virt.disk.api [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Cannot resize image /var/lib/nova/instances/98af52ad-2964-4557-85f1-eb0343d8f085/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.358 186666 DEBUG nova.objects.instance [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'migration_context' on Instance uuid 98af52ad-2964-4557-85f1-eb0343d8f085 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.866 186666 DEBUG nova.objects.base [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Object Instance<98af52ad-2964-4557-85f1-eb0343d8f085> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.867 186666 DEBUG oslo_concurrency.processutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/98af52ad-2964-4557-85f1-eb0343d8f085/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.883 186666 DEBUG oslo_concurrency.processutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/98af52ad-2964-4557-85f1-eb0343d8f085/disk.config 497664" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.885 186666 DEBUG nova.virt.libvirt.driver [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.887 186666 DEBUG nova.virt.libvirt.vif [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-02-19T19:44:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-339461723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-339461723',id=22,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:44:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e7e566fe9744481b8016f2a804a68c2b',ramdisk_id='',reservation_id='r-acjlae8w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-285058964',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-285058964-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:44:15Z,user_data=None,user_id='eb789bc56fc941ed8761873b0815c588',uuid=98af52ad-2964-4557-85f1-eb0343d8f085,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4adba740-03c5-45a4-8b6f-3cffa5076199", "address": "fa:16:3e:d2:5a:0c", "network": {"id": "c593f4fa-8caf-4204-a168-7d36dea7afd9", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-530295078-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "494c6c3df70542bd8ad63a3ad2241fe8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4adba740-03", "ovs_interfaceid": "4adba740-03c5-45a4-8b6f-3cffa5076199", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.888 186666 DEBUG nova.network.os_vif_util [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "4adba740-03c5-45a4-8b6f-3cffa5076199", "address": "fa:16:3e:d2:5a:0c", "network": {"id": "c593f4fa-8caf-4204-a168-7d36dea7afd9", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-530295078-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "494c6c3df70542bd8ad63a3ad2241fe8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4adba740-03", "ovs_interfaceid": "4adba740-03c5-45a4-8b6f-3cffa5076199", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.889 186666 DEBUG nova.network.os_vif_util [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:5a:0c,bridge_name='br-int',has_traffic_filtering=True,id=4adba740-03c5-45a4-8b6f-3cffa5076199,network=Network(c593f4fa-8caf-4204-a168-7d36dea7afd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4adba740-03') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.890 186666 DEBUG os_vif [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:5a:0c,bridge_name='br-int',has_traffic_filtering=True,id=4adba740-03c5-45a4-8b6f-3cffa5076199,network=Network(c593f4fa-8caf-4204-a168-7d36dea7afd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4adba740-03') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.891 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.892 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.892 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.894 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.895 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '9201eb9d-bee4-57a2-9911-9fbcb1654039', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.897 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.899 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.902 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.903 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4adba740-03, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.904 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap4adba740-03, col_values=(('qos', UUID('ad6aa918-426b-488b-9790-f349a583684f')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.904 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap4adba740-03, col_values=(('external_ids', {'iface-id': '4adba740-03c5-45a4-8b6f-3cffa5076199', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d2:5a:0c', 'vm-uuid': '98af52ad-2964-4557-85f1-eb0343d8f085'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.906 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:03 compute-0 NetworkManager[56519]: <info>  [1771530303.9067] manager: (tap4adba740-03): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.909 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.910 186666 INFO os_vif [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:5a:0c,bridge_name='br-int',has_traffic_filtering=True,id=4adba740-03c5-45a4-8b6f-3cffa5076199,network=Network(c593f4fa-8caf-4204-a168-7d36dea7afd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4adba740-03')
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.911 186666 DEBUG nova.virt.libvirt.driver [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.912 186666 DEBUG nova.compute.manager [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2dd7idmo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='98af52ad-2964-4557-85f1-eb0343d8f085',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Feb 19 19:45:03 compute-0 nova_compute[186662]: 2026-02-19 19:45:03.913 186666 WARNING neutronclient.v2_0.client [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:45:04 compute-0 podman[215464]: 2026-02-19 19:45:04.330683553 +0000 UTC m=+0.108188053 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Feb 19 19:45:04 compute-0 nova_compute[186662]: 2026-02-19 19:45:04.992 186666 WARNING neutronclient.v2_0.client [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:45:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:05.316 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:45:05 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:05.317 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:45:05 compute-0 nova_compute[186662]: 2026-02-19 19:45:05.319 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:05 compute-0 nova_compute[186662]: 2026-02-19 19:45:05.663 186666 DEBUG nova.network.neutron [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Port 4adba740-03c5-45a4-8b6f-3cffa5076199 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Feb 19 19:45:05 compute-0 nova_compute[186662]: 2026-02-19 19:45:05.674 186666 DEBUG nova.compute.manager [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2dd7idmo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='98af52ad-2964-4557-85f1-eb0343d8f085',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Feb 19 19:45:05 compute-0 nova_compute[186662]: 2026-02-19 19:45:05.679 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:08 compute-0 nova_compute[186662]: 2026-02-19 19:45:08.936 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:09 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 19 19:45:09 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 19 19:45:09 compute-0 podman[215491]: 2026-02-19 19:45:09.082328928 +0000 UTC m=+0.048775832 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:45:09 compute-0 kernel: tap4adba740-03: entered promiscuous mode
Feb 19 19:45:09 compute-0 NetworkManager[56519]: <info>  [1771530309.1842] manager: (tap4adba740-03): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Feb 19 19:45:09 compute-0 ovn_controller[96653]: 2026-02-19T19:45:09Z|00171|binding|INFO|Claiming lport 4adba740-03c5-45a4-8b6f-3cffa5076199 for this additional chassis.
Feb 19 19:45:09 compute-0 ovn_controller[96653]: 2026-02-19T19:45:09Z|00172|binding|INFO|4adba740-03c5-45a4-8b6f-3cffa5076199: Claiming fa:16:3e:d2:5a:0c 10.100.0.10
Feb 19 19:45:09 compute-0 nova_compute[186662]: 2026-02-19 19:45:09.186 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:09 compute-0 nova_compute[186662]: 2026-02-19 19:45:09.189 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:09 compute-0 nova_compute[186662]: 2026-02-19 19:45:09.191 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.197 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:5a:0c 10.100.0.10'], port_security=['fa:16:3e:d2:5a:0c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '98af52ad-2964-4557-85f1-eb0343d8f085', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c593f4fa-8caf-4204-a168-7d36dea7afd9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e7e566fe9744481b8016f2a804a68c2b', 'neutron:revision_number': '10', 'neutron:security_group_ids': '1f057594-2461-4678-b2b0-6a6c86e352f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a41cc354-2c45-4738-99b1-e4951b7f67ae, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=4adba740-03c5-45a4-8b6f-3cffa5076199) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.198 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 4adba740-03c5-45a4-8b6f-3cffa5076199 in datapath c593f4fa-8caf-4204-a168-7d36dea7afd9 unbound from our chassis
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.199 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c593f4fa-8caf-4204-a168-7d36dea7afd9
Feb 19 19:45:09 compute-0 nova_compute[186662]: 2026-02-19 19:45:09.205 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:09 compute-0 ovn_controller[96653]: 2026-02-19T19:45:09Z|00173|binding|INFO|Setting lport 4adba740-03c5-45a4-8b6f-3cffa5076199 ovn-installed in OVS
Feb 19 19:45:09 compute-0 nova_compute[186662]: 2026-02-19 19:45:09.208 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.208 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[013afb45-9118-4868-8bf4-c15d86b8f037]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.209 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc593f4fa-81 in ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Feb 19 19:45:09 compute-0 systemd-machined[156014]: New machine qemu-16-instance-00000016.
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.211 207540 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc593f4fa-80 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.211 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[cdfc93ef-2455-46b7-a712-79757edab443]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.212 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[0f0aa113-5512-43a4-9254-e27750361345]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.219 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[1916b7a4-d954-4483-83df-6b9a9bf6a0d5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:09 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000016.
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.225 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[eab37fec-5d93-481e-be3e-1f39503c356c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:09 compute-0 systemd-udevd[215554]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.245 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[ba41193e-cfc8-4398-9d34-1dc21aabdad6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:09 compute-0 NetworkManager[56519]: <info>  [1771530309.2473] device (tap4adba740-03): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:45:09 compute-0 NetworkManager[56519]: <info>  [1771530309.2481] device (tap4adba740-03): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.249 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[2be00d60-2d61-45a8-ada3-fbf5ff9fb24c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:09 compute-0 NetworkManager[56519]: <info>  [1771530309.2510] manager: (tapc593f4fa-80): new Veth device (/org/freedesktop/NetworkManager/Devices/69)
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.273 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[6afef303-95e9-47e2-bf3c-9d180b171115]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.275 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[0312cd20-5cc6-4434-bf0c-96b61a045e2c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:09 compute-0 NetworkManager[56519]: <info>  [1771530309.2902] device (tapc593f4fa-80): carrier: link connected
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.294 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[4a345db5-dbe0-4efc-ad06-052fbe53cd67]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.304 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[49540757-2375-4c6a-bcae-881322bf38a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc593f4fa-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f5:91:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457885, 'reachable_time': 33733, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215582, 'error': None, 'target': 'ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.314 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ebcb0c7c-51bf-4f26-843c-ae92f1a12972]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef5:915e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 457885, 'tstamp': 457885}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215583, 'error': None, 'target': 'ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.324 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[68088059-7dac-4002-8450-7de8d1cdf66a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc593f4fa-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f5:91:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457885, 'reachable_time': 33733, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215584, 'error': None, 'target': 'ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.339 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[4b51f5f9-a129-483d-81c9-085b93265f15]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.380 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[7f1f642b-3461-43d0-aa3f-06c7febde013]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.381 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc593f4fa-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.382 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.382 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc593f4fa-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:45:09 compute-0 nova_compute[186662]: 2026-02-19 19:45:09.383 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:09 compute-0 NetworkManager[56519]: <info>  [1771530309.3843] manager: (tapc593f4fa-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Feb 19 19:45:09 compute-0 kernel: tapc593f4fa-80: entered promiscuous mode
Feb 19 19:45:09 compute-0 nova_compute[186662]: 2026-02-19 19:45:09.386 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.387 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc593f4fa-80, col_values=(('external_ids', {'iface-id': '35fb9fb2-2f23-4514-91e1-a3a8a8349d32'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:45:09 compute-0 nova_compute[186662]: 2026-02-19 19:45:09.387 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:09 compute-0 ovn_controller[96653]: 2026-02-19T19:45:09Z|00174|binding|INFO|Releasing lport 35fb9fb2-2f23-4514-91e1-a3a8a8349d32 from this chassis (sb_readonly=0)
Feb 19 19:45:09 compute-0 nova_compute[186662]: 2026-02-19 19:45:09.388 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:09 compute-0 nova_compute[186662]: 2026-02-19 19:45:09.393 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.393 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[5dcbe797-d23b-49c4-998c-690c456fbc1e]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.394 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.394 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.394 105986 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for c593f4fa-8caf-4204-a168-7d36dea7afd9 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.395 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.395 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[479eda7e-5419-4d44-9242-080a0f882d23]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.395 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.396 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[dbfae512-dbf8-46d6-9599-e98f7fc6e77d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.396 105986 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: global
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     log         /dev/log local0 debug
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     log-tag     haproxy-metadata-proxy-c593f4fa-8caf-4204-a168-7d36dea7afd9
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     user        root
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     group       root
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     maxconn     1024
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     pidfile     /var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     daemon
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: defaults
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     log global
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     mode http
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     option httplog
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     option dontlognull
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     option http-server-close
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     option forwardfor
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     retries                 3
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     timeout http-request    30s
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     timeout connect         30s
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     timeout client          32s
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     timeout server          32s
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     timeout http-keep-alive 30s
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: listen listener
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     bind 169.254.169.254:80
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     server metadata /var/lib/neutron/metadata_proxy
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:     http-request add-header X-OVN-Network-ID c593f4fa-8caf-4204-a168-7d36dea7afd9
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Feb 19 19:45:09 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:09.396 105986 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9', 'env', 'PROCESS_TAG=haproxy-c593f4fa-8caf-4204-a168-7d36dea7afd9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c593f4fa-8caf-4204-a168-7d36dea7afd9.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Feb 19 19:45:09 compute-0 podman[215623]: 2026-02-19 19:45:09.70991464 +0000 UTC m=+0.047651565 container create a5cb80b748e0fd2253661e981672e71f3146e36ce9ff8134588967f9ec7fc34d (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.license=GPLv2)
Feb 19 19:45:09 compute-0 systemd[1]: Started libpod-conmon-a5cb80b748e0fd2253661e981672e71f3146e36ce9ff8134588967f9ec7fc34d.scope.
Feb 19 19:45:09 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:45:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39b7bd67405db889a5fa1f995ff5c006c90d6840da87bb054cab9bad3b21f800/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 19 19:45:09 compute-0 podman[215623]: 2026-02-19 19:45:09.678042447 +0000 UTC m=+0.015779382 image pull dfc4ee2c621ea595c16a7cd7a8233293e80df6b99346b1ce1a7ed22a35aace27 38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Feb 19 19:45:09 compute-0 podman[215623]: 2026-02-19 19:45:09.785801808 +0000 UTC m=+0.123538763 container init a5cb80b748e0fd2253661e981672e71f3146e36ce9ff8134588967f9ec7fc34d (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:45:09 compute-0 podman[215623]: 2026-02-19 19:45:09.793247689 +0000 UTC m=+0.130984644 container start a5cb80b748e0fd2253661e981672e71f3146e36ce9ff8134588967f9ec7fc34d (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 19 19:45:09 compute-0 neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9[215639]: [NOTICE]   (215643) : New worker (215645) forked
Feb 19 19:45:09 compute-0 neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9[215639]: [NOTICE]   (215643) : Loading success.
Feb 19 19:45:10 compute-0 nova_compute[186662]: 2026-02-19 19:45:10.680 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:11 compute-0 ovn_controller[96653]: 2026-02-19T19:45:11Z|00175|binding|INFO|Claiming lport 4adba740-03c5-45a4-8b6f-3cffa5076199 for this chassis.
Feb 19 19:45:11 compute-0 ovn_controller[96653]: 2026-02-19T19:45:11Z|00176|binding|INFO|4adba740-03c5-45a4-8b6f-3cffa5076199: Claiming fa:16:3e:d2:5a:0c 10.100.0.10
Feb 19 19:45:11 compute-0 ovn_controller[96653]: 2026-02-19T19:45:11Z|00177|binding|INFO|Setting lport 4adba740-03c5-45a4-8b6f-3cffa5076199 up in Southbound
Feb 19 19:45:12 compute-0 nova_compute[186662]: 2026-02-19 19:45:12.528 186666 INFO nova.compute.manager [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Post operation of migration started
Feb 19 19:45:12 compute-0 nova_compute[186662]: 2026-02-19 19:45:12.529 186666 WARNING neutronclient.v2_0.client [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:45:13 compute-0 nova_compute[186662]: 2026-02-19 19:45:13.005 186666 WARNING neutronclient.v2_0.client [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:45:13 compute-0 nova_compute[186662]: 2026-02-19 19:45:13.006 186666 WARNING neutronclient.v2_0.client [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:45:13 compute-0 nova_compute[186662]: 2026-02-19 19:45:13.083 186666 DEBUG oslo_concurrency.lockutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-98af52ad-2964-4557-85f1-eb0343d8f085" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:45:13 compute-0 nova_compute[186662]: 2026-02-19 19:45:13.083 186666 DEBUG oslo_concurrency.lockutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-98af52ad-2964-4557-85f1-eb0343d8f085" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:45:13 compute-0 nova_compute[186662]: 2026-02-19 19:45:13.083 186666 DEBUG nova.network.neutron [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:45:13 compute-0 nova_compute[186662]: 2026-02-19 19:45:13.589 186666 WARNING neutronclient.v2_0.client [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:45:13 compute-0 nova_compute[186662]: 2026-02-19 19:45:13.937 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:14 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:14.318 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:45:14 compute-0 nova_compute[186662]: 2026-02-19 19:45:14.339 186666 WARNING neutronclient.v2_0.client [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:45:14 compute-0 nova_compute[186662]: 2026-02-19 19:45:14.473 186666 DEBUG nova.network.neutron [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Updating instance_info_cache with network_info: [{"id": "4adba740-03c5-45a4-8b6f-3cffa5076199", "address": "fa:16:3e:d2:5a:0c", "network": {"id": "c593f4fa-8caf-4204-a168-7d36dea7afd9", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-530295078-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "494c6c3df70542bd8ad63a3ad2241fe8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4adba740-03", "ovs_interfaceid": "4adba740-03c5-45a4-8b6f-3cffa5076199", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:45:14 compute-0 nova_compute[186662]: 2026-02-19 19:45:14.978 186666 DEBUG oslo_concurrency.lockutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-98af52ad-2964-4557-85f1-eb0343d8f085" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:45:15 compute-0 nova_compute[186662]: 2026-02-19 19:45:15.495 186666 DEBUG oslo_concurrency.lockutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:45:15 compute-0 nova_compute[186662]: 2026-02-19 19:45:15.495 186666 DEBUG oslo_concurrency.lockutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:45:15 compute-0 nova_compute[186662]: 2026-02-19 19:45:15.495 186666 DEBUG oslo_concurrency.lockutils [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:45:15 compute-0 nova_compute[186662]: 2026-02-19 19:45:15.499 186666 INFO nova.virt.libvirt.driver [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 19 19:45:15 compute-0 virtqemud[186157]: Domain id=16 name='instance-00000016' uuid=98af52ad-2964-4557-85f1-eb0343d8f085 is tainted: custom-monitor
Feb 19 19:45:15 compute-0 nova_compute[186662]: 2026-02-19 19:45:15.682 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:16 compute-0 nova_compute[186662]: 2026-02-19 19:45:16.504 186666 INFO nova.virt.libvirt.driver [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 19 19:45:17 compute-0 nova_compute[186662]: 2026-02-19 19:45:17.509 186666 INFO nova.virt.libvirt.driver [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 19 19:45:17 compute-0 nova_compute[186662]: 2026-02-19 19:45:17.512 186666 DEBUG nova.compute.manager [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:45:18 compute-0 nova_compute[186662]: 2026-02-19 19:45:18.022 186666 DEBUG nova.objects.instance [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Feb 19 19:45:18 compute-0 nova_compute[186662]: 2026-02-19 19:45:18.972 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:19 compute-0 nova_compute[186662]: 2026-02-19 19:45:19.039 186666 WARNING neutronclient.v2_0.client [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:45:19 compute-0 nova_compute[186662]: 2026-02-19 19:45:19.109 186666 WARNING neutronclient.v2_0.client [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:45:19 compute-0 nova_compute[186662]: 2026-02-19 19:45:19.110 186666 WARNING neutronclient.v2_0.client [None req-989c7eb7-cf54-45e3-b3d9-10d9d81618fc 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:45:19 compute-0 nova_compute[186662]: 2026-02-19 19:45:19.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:45:20 compute-0 nova_compute[186662]: 2026-02-19 19:45:20.684 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:23 compute-0 nova_compute[186662]: 2026-02-19 19:45:23.081 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:45:23 compute-0 nova_compute[186662]: 2026-02-19 19:45:23.977 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:25 compute-0 nova_compute[186662]: 2026-02-19 19:45:25.577 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:45:25 compute-0 nova_compute[186662]: 2026-02-19 19:45:25.685 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:27 compute-0 nova_compute[186662]: 2026-02-19 19:45:27.577 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:45:28 compute-0 nova_compute[186662]: 2026-02-19 19:45:28.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:45:28 compute-0 nova_compute[186662]: 2026-02-19 19:45:28.576 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:45:28 compute-0 nova_compute[186662]: 2026-02-19 19:45:28.979 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:29 compute-0 podman[215669]: 2026-02-19 19:45:29.274378136 +0000 UTC m=+0.040066833 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:45:29 compute-0 nova_compute[186662]: 2026-02-19 19:45:29.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:45:29 compute-0 podman[196025]: time="2026-02-19T19:45:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:45:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:45:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17221 "" "Go-http-client/1.1"
Feb 19 19:45:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:45:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2653 "" "Go-http-client/1.1"
Feb 19 19:45:30 compute-0 nova_compute[186662]: 2026-02-19 19:45:30.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:45:30 compute-0 nova_compute[186662]: 2026-02-19 19:45:30.688 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:31 compute-0 nova_compute[186662]: 2026-02-19 19:45:31.090 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:45:31 compute-0 nova_compute[186662]: 2026-02-19 19:45:31.090 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:45:31 compute-0 nova_compute[186662]: 2026-02-19 19:45:31.090 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:45:31 compute-0 nova_compute[186662]: 2026-02-19 19:45:31.091 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:45:31 compute-0 openstack_network_exporter[198916]: ERROR   19:45:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:45:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:45:31 compute-0 openstack_network_exporter[198916]: ERROR   19:45:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:45:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:45:32 compute-0 nova_compute[186662]: 2026-02-19 19:45:32.131 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/98af52ad-2964-4557-85f1-eb0343d8f085/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:45:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:32.150 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:45:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:32.151 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:45:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:32.151 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:45:32 compute-0 nova_compute[186662]: 2026-02-19 19:45:32.189 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/98af52ad-2964-4557-85f1-eb0343d8f085/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:45:32 compute-0 nova_compute[186662]: 2026-02-19 19:45:32.190 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/98af52ad-2964-4557-85f1-eb0343d8f085/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:45:32 compute-0 nova_compute[186662]: 2026-02-19 19:45:32.232 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/98af52ad-2964-4557-85f1-eb0343d8f085/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:45:32 compute-0 nova_compute[186662]: 2026-02-19 19:45:32.382 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:45:32 compute-0 nova_compute[186662]: 2026-02-19 19:45:32.383 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:45:32 compute-0 nova_compute[186662]: 2026-02-19 19:45:32.394 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:45:32 compute-0 nova_compute[186662]: 2026-02-19 19:45:32.395 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5646MB free_disk=72.94668579101562GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:45:32 compute-0 nova_compute[186662]: 2026-02-19 19:45:32.395 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:45:32 compute-0 nova_compute[186662]: 2026-02-19 19:45:32.395 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:45:33 compute-0 podman[215698]: 2026-02-19 19:45:33.278871105 +0000 UTC m=+0.053484594 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, distribution-scope=public, config_id=openstack_network_exporter)
Feb 19 19:45:33 compute-0 nova_compute[186662]: 2026-02-19 19:45:33.943 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance 98af52ad-2964-4557-85f1-eb0343d8f085 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:45:33 compute-0 nova_compute[186662]: 2026-02-19 19:45:33.944 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:45:33 compute-0 nova_compute[186662]: 2026-02-19 19:45:33.944 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:45:32 up  1:16,  0 user,  load average: 0.02, 0.17, 0.27\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e7e566fe9744481b8016f2a804a68c2b': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:45:33 compute-0 nova_compute[186662]: 2026-02-19 19:45:33.978 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:45:33 compute-0 nova_compute[186662]: 2026-02-19 19:45:33.981 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:34 compute-0 nova_compute[186662]: 2026-02-19 19:45:34.487 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:45:34 compute-0 nova_compute[186662]: 2026-02-19 19:45:34.996 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:45:34 compute-0 nova_compute[186662]: 2026-02-19 19:45:34.997 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.601s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:45:35 compute-0 nova_compute[186662]: 2026-02-19 19:45:35.047 186666 DEBUG oslo_concurrency.lockutils [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Acquiring lock "98af52ad-2964-4557-85f1-eb0343d8f085" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:45:35 compute-0 nova_compute[186662]: 2026-02-19 19:45:35.048 186666 DEBUG oslo_concurrency.lockutils [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Lock "98af52ad-2964-4557-85f1-eb0343d8f085" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:45:35 compute-0 nova_compute[186662]: 2026-02-19 19:45:35.049 186666 DEBUG oslo_concurrency.lockutils [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Acquiring lock "98af52ad-2964-4557-85f1-eb0343d8f085-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:45:35 compute-0 nova_compute[186662]: 2026-02-19 19:45:35.049 186666 DEBUG oslo_concurrency.lockutils [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Lock "98af52ad-2964-4557-85f1-eb0343d8f085-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:45:35 compute-0 nova_compute[186662]: 2026-02-19 19:45:35.050 186666 DEBUG oslo_concurrency.lockutils [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Lock "98af52ad-2964-4557-85f1-eb0343d8f085-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:45:35 compute-0 nova_compute[186662]: 2026-02-19 19:45:35.065 186666 INFO nova.compute.manager [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Terminating instance
Feb 19 19:45:35 compute-0 podman[215720]: 2026-02-19 19:45:35.317684579 +0000 UTC m=+0.085167747 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.43.0, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:45:35 compute-0 nova_compute[186662]: 2026-02-19 19:45:35.589 186666 DEBUG nova.compute.manager [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:45:35 compute-0 kernel: tap4adba740-03 (unregistering): left promiscuous mode
Feb 19 19:45:35 compute-0 NetworkManager[56519]: <info>  [1771530335.6226] device (tap4adba740-03): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:45:35 compute-0 ovn_controller[96653]: 2026-02-19T19:45:35Z|00178|binding|INFO|Releasing lport 4adba740-03c5-45a4-8b6f-3cffa5076199 from this chassis (sb_readonly=0)
Feb 19 19:45:35 compute-0 ovn_controller[96653]: 2026-02-19T19:45:35Z|00179|binding|INFO|Setting lport 4adba740-03c5-45a4-8b6f-3cffa5076199 down in Southbound
Feb 19 19:45:35 compute-0 ovn_controller[96653]: 2026-02-19T19:45:35Z|00180|binding|INFO|Removing iface tap4adba740-03 ovn-installed in OVS
Feb 19 19:45:35 compute-0 nova_compute[186662]: 2026-02-19 19:45:35.631 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:35 compute-0 nova_compute[186662]: 2026-02-19 19:45:35.632 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:35.639 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:5a:0c 10.100.0.10'], port_security=['fa:16:3e:d2:5a:0c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '98af52ad-2964-4557-85f1-eb0343d8f085', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c593f4fa-8caf-4204-a168-7d36dea7afd9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e7e566fe9744481b8016f2a804a68c2b', 'neutron:revision_number': '15', 'neutron:security_group_ids': '1f057594-2461-4678-b2b0-6a6c86e352f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a41cc354-2c45-4738-99b1-e4951b7f67ae, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=4adba740-03c5-45a4-8b6f-3cffa5076199) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:45:35 compute-0 nova_compute[186662]: 2026-02-19 19:45:35.638 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:35.640 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 4adba740-03c5-45a4-8b6f-3cffa5076199 in datapath c593f4fa-8caf-4204-a168-7d36dea7afd9 unbound from our chassis
Feb 19 19:45:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:35.641 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c593f4fa-8caf-4204-a168-7d36dea7afd9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:45:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:35.643 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[345a6c9e-2658-4f8c-ba70-1cea399096f0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:35.644 105986 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9 namespace which is not needed anymore
Feb 19 19:45:35 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000016.scope: Deactivated successfully.
Feb 19 19:45:35 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000016.scope: Consumed 1.831s CPU time.
Feb 19 19:45:35 compute-0 systemd-machined[156014]: Machine qemu-16-instance-00000016 terminated.
Feb 19 19:45:35 compute-0 nova_compute[186662]: 2026-02-19 19:45:35.689 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:35 compute-0 neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9[215639]: [NOTICE]   (215643) : haproxy version is 3.0.5-8e879a5
Feb 19 19:45:35 compute-0 neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9[215639]: [NOTICE]   (215643) : path to executable is /usr/sbin/haproxy
Feb 19 19:45:35 compute-0 neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9[215639]: [WARNING]  (215643) : Exiting Master process...
Feb 19 19:45:35 compute-0 podman[215770]: 2026-02-19 19:45:35.761288202 +0000 UTC m=+0.032311630 container kill a5cb80b748e0fd2253661e981672e71f3146e36ce9ff8134588967f9ec7fc34d (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, io.buildah.version=1.43.0)
Feb 19 19:45:35 compute-0 neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9[215639]: [ALERT]    (215643) : Current worker (215645) exited with code 143 (Terminated)
Feb 19 19:45:35 compute-0 neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9[215639]: [WARNING]  (215643) : All workers exited. Exiting... (0)
Feb 19 19:45:35 compute-0 systemd[1]: libpod-a5cb80b748e0fd2253661e981672e71f3146e36ce9ff8134588967f9ec7fc34d.scope: Deactivated successfully.
Feb 19 19:45:35 compute-0 podman[215785]: 2026-02-19 19:45:35.811581058 +0000 UTC m=+0.033369274 container died a5cb80b748e0fd2253661e981672e71f3146e36ce9ff8134588967f9ec7fc34d (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 19 19:45:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a5cb80b748e0fd2253661e981672e71f3146e36ce9ff8134588967f9ec7fc34d-userdata-shm.mount: Deactivated successfully.
Feb 19 19:45:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-39b7bd67405db889a5fa1f995ff5c006c90d6840da87bb054cab9bad3b21f800-merged.mount: Deactivated successfully.
Feb 19 19:45:35 compute-0 nova_compute[186662]: 2026-02-19 19:45:35.843 186666 INFO nova.virt.libvirt.driver [-] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Instance destroyed successfully.
Feb 19 19:45:35 compute-0 nova_compute[186662]: 2026-02-19 19:45:35.844 186666 DEBUG nova.objects.instance [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Lazy-loading 'resources' on Instance uuid 98af52ad-2964-4557-85f1-eb0343d8f085 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:45:35 compute-0 podman[215785]: 2026-02-19 19:45:35.848459226 +0000 UTC m=+0.070247402 container cleanup a5cb80b748e0fd2253661e981672e71f3146e36ce9ff8134588967f9ec7fc34d (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 19 19:45:35 compute-0 systemd[1]: libpod-conmon-a5cb80b748e0fd2253661e981672e71f3146e36ce9ff8134588967f9ec7fc34d.scope: Deactivated successfully.
Feb 19 19:45:35 compute-0 podman[215787]: 2026-02-19 19:45:35.864602109 +0000 UTC m=+0.083831735 container remove a5cb80b748e0fd2253661e981672e71f3146e36ce9ff8134588967f9ec7fc34d (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 19 19:45:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:35.868 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[cd0c9d00-f856-487f-8b31-fe37b69f4a7a]: (4, ("Thu Feb 19 07:45:35 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9 (a5cb80b748e0fd2253661e981672e71f3146e36ce9ff8134588967f9ec7fc34d)\na5cb80b748e0fd2253661e981672e71f3146e36ce9ff8134588967f9ec7fc34d\nThu Feb 19 07:45:35 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9 (a5cb80b748e0fd2253661e981672e71f3146e36ce9ff8134588967f9ec7fc34d)\na5cb80b748e0fd2253661e981672e71f3146e36ce9ff8134588967f9ec7fc34d\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:35.869 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[0dd12a4f-9fc3-4910-8242-a036abcad5c7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:35.870 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:45:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:35.870 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[e622d0b0-0a86-41c0-8c37-bc43d387703a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:35.871 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc593f4fa-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:45:35 compute-0 nova_compute[186662]: 2026-02-19 19:45:35.872 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:35 compute-0 kernel: tapc593f4fa-80: left promiscuous mode
Feb 19 19:45:35 compute-0 nova_compute[186662]: 2026-02-19 19:45:35.880 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:35.882 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[73be5675-9ccf-4369-a3e6-bc549bdfd7ee]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:35.897 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a9e28ae0-7c68-4a95-8712-8961766fd315]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:35.897 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[634bf0cd-b75e-499c-adc2-56b4fe22fa6f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:35.906 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a3a6c474-be18-417c-8bfa-07a16e811ca6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457880, 'reachable_time': 26020, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215840, 'error': None, 'target': 'ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:35.909 106358 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Feb 19 19:45:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:45:35.909 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[d00c526c-6719-4b93-9616-cfe6dfda4ff7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:45:35 compute-0 systemd[1]: run-netns-ovnmeta\x2dc593f4fa\x2d8caf\x2d4204\x2da168\x2d7d36dea7afd9.mount: Deactivated successfully.
Feb 19 19:45:35 compute-0 nova_compute[186662]: 2026-02-19 19:45:35.993 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.097 186666 DEBUG nova.compute.manager [req-d699e6bf-f0e4-411c-9e12-abb7ce338cdd req-0d08da7c-29fc-4ea5-8ec3-70b19d149f4d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Received event network-vif-unplugged-4adba740-03c5-45a4-8b6f-3cffa5076199 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.097 186666 DEBUG oslo_concurrency.lockutils [req-d699e6bf-f0e4-411c-9e12-abb7ce338cdd req-0d08da7c-29fc-4ea5-8ec3-70b19d149f4d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "98af52ad-2964-4557-85f1-eb0343d8f085-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.098 186666 DEBUG oslo_concurrency.lockutils [req-d699e6bf-f0e4-411c-9e12-abb7ce338cdd req-0d08da7c-29fc-4ea5-8ec3-70b19d149f4d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "98af52ad-2964-4557-85f1-eb0343d8f085-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.099 186666 DEBUG oslo_concurrency.lockutils [req-d699e6bf-f0e4-411c-9e12-abb7ce338cdd req-0d08da7c-29fc-4ea5-8ec3-70b19d149f4d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "98af52ad-2964-4557-85f1-eb0343d8f085-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.099 186666 DEBUG nova.compute.manager [req-d699e6bf-f0e4-411c-9e12-abb7ce338cdd req-0d08da7c-29fc-4ea5-8ec3-70b19d149f4d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] No waiting events found dispatching network-vif-unplugged-4adba740-03c5-45a4-8b6f-3cffa5076199 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.100 186666 DEBUG nova.compute.manager [req-d699e6bf-f0e4-411c-9e12-abb7ce338cdd req-0d08da7c-29fc-4ea5-8ec3-70b19d149f4d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Received event network-vif-unplugged-4adba740-03c5-45a4-8b6f-3cffa5076199 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.350 186666 DEBUG nova.virt.libvirt.vif [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-02-19T19:44:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-339461723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-339461723',id=22,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:44:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e7e566fe9744481b8016f2a804a68c2b',ramdisk_id='',reservation_id='r-acjlae8w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_
model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-285058964',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-285058964-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:45:18Z,user_data=None,user_id='eb789bc56fc941ed8761873b0815c588',uuid=98af52ad-2964-4557-85f1-eb0343d8f085,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4adba740-03c5-45a4-8b6f-3cffa5076199", "address": "fa:16:3e:d2:5a:0c", "network": {"id": "c593f4fa-8caf-4204-a168-7d36dea7afd9", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-530295078-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "494c6c3df70542bd8ad63a3ad2241fe8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4adba740-03", "ovs_interfaceid": "4adba740-03c5-45a4-8b6f-3cffa5076199", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.351 186666 DEBUG nova.network.os_vif_util [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Converting VIF {"id": "4adba740-03c5-45a4-8b6f-3cffa5076199", "address": "fa:16:3e:d2:5a:0c", "network": {"id": "c593f4fa-8caf-4204-a168-7d36dea7afd9", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-530295078-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "494c6c3df70542bd8ad63a3ad2241fe8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4adba740-03", "ovs_interfaceid": "4adba740-03c5-45a4-8b6f-3cffa5076199", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.353 186666 DEBUG nova.network.os_vif_util [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d2:5a:0c,bridge_name='br-int',has_traffic_filtering=True,id=4adba740-03c5-45a4-8b6f-3cffa5076199,network=Network(c593f4fa-8caf-4204-a168-7d36dea7afd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4adba740-03') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.353 186666 DEBUG os_vif [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d2:5a:0c,bridge_name='br-int',has_traffic_filtering=True,id=4adba740-03c5-45a4-8b6f-3cffa5076199,network=Network(c593f4fa-8caf-4204-a168-7d36dea7afd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4adba740-03') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.356 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.356 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4adba740-03, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.358 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.360 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.361 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.362 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=ad6aa918-426b-488b-9790-f349a583684f) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.363 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.364 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.366 186666 INFO os_vif [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d2:5a:0c,bridge_name='br-int',has_traffic_filtering=True,id=4adba740-03c5-45a4-8b6f-3cffa5076199,network=Network(c593f4fa-8caf-4204-a168-7d36dea7afd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4adba740-03')
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.367 186666 INFO nova.virt.libvirt.driver [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Deleting instance files /var/lib/nova/instances/98af52ad-2964-4557-85f1-eb0343d8f085_del
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.368 186666 INFO nova.virt.libvirt.driver [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Deletion of /var/lib/nova/instances/98af52ad-2964-4557-85f1-eb0343d8f085_del complete
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.503 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.503 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.880 186666 INFO nova.compute.manager [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Took 1.29 seconds to destroy the instance on the hypervisor.
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.881 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.881 186666 DEBUG nova.compute.manager [-] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.882 186666 DEBUG nova.network.neutron [-] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.882 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:45:36 compute-0 nova_compute[186662]: 2026-02-19 19:45:36.997 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:45:37 compute-0 nova_compute[186662]: 2026-02-19 19:45:37.429 186666 DEBUG nova.compute.manager [req-aa4645a8-fdc3-4fb5-ba13-849b496e9fd5 req-2e079379-ee14-48c6-aca6-3442b320337f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Received event network-vif-deleted-4adba740-03c5-45a4-8b6f-3cffa5076199 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:45:37 compute-0 nova_compute[186662]: 2026-02-19 19:45:37.429 186666 INFO nova.compute.manager [req-aa4645a8-fdc3-4fb5-ba13-849b496e9fd5 req-2e079379-ee14-48c6-aca6-3442b320337f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Neutron deleted interface 4adba740-03c5-45a4-8b6f-3cffa5076199; detaching it from the instance and deleting it from the info cache
Feb 19 19:45:37 compute-0 nova_compute[186662]: 2026-02-19 19:45:37.430 186666 DEBUG nova.network.neutron [req-aa4645a8-fdc3-4fb5-ba13-849b496e9fd5 req-2e079379-ee14-48c6-aca6-3442b320337f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:45:37 compute-0 nova_compute[186662]: 2026-02-19 19:45:37.880 186666 DEBUG nova.network.neutron [-] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:45:37 compute-0 nova_compute[186662]: 2026-02-19 19:45:37.935 186666 DEBUG nova.compute.manager [req-aa4645a8-fdc3-4fb5-ba13-849b496e9fd5 req-2e079379-ee14-48c6-aca6-3442b320337f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Detach interface failed, port_id=4adba740-03c5-45a4-8b6f-3cffa5076199, reason: Instance 98af52ad-2964-4557-85f1-eb0343d8f085 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Feb 19 19:45:38 compute-0 nova_compute[186662]: 2026-02-19 19:45:38.158 186666 DEBUG nova.compute.manager [req-ac51449f-c5d0-4c53-a09f-be33ac1fc47b req-c7eb0c13-f4a9-42ec-abbe-ed10421999d0 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Received event network-vif-unplugged-4adba740-03c5-45a4-8b6f-3cffa5076199 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:45:38 compute-0 nova_compute[186662]: 2026-02-19 19:45:38.159 186666 DEBUG oslo_concurrency.lockutils [req-ac51449f-c5d0-4c53-a09f-be33ac1fc47b req-c7eb0c13-f4a9-42ec-abbe-ed10421999d0 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "98af52ad-2964-4557-85f1-eb0343d8f085-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:45:38 compute-0 nova_compute[186662]: 2026-02-19 19:45:38.159 186666 DEBUG oslo_concurrency.lockutils [req-ac51449f-c5d0-4c53-a09f-be33ac1fc47b req-c7eb0c13-f4a9-42ec-abbe-ed10421999d0 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "98af52ad-2964-4557-85f1-eb0343d8f085-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:45:38 compute-0 nova_compute[186662]: 2026-02-19 19:45:38.160 186666 DEBUG oslo_concurrency.lockutils [req-ac51449f-c5d0-4c53-a09f-be33ac1fc47b req-c7eb0c13-f4a9-42ec-abbe-ed10421999d0 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "98af52ad-2964-4557-85f1-eb0343d8f085-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:45:38 compute-0 nova_compute[186662]: 2026-02-19 19:45:38.160 186666 DEBUG nova.compute.manager [req-ac51449f-c5d0-4c53-a09f-be33ac1fc47b req-c7eb0c13-f4a9-42ec-abbe-ed10421999d0 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] No waiting events found dispatching network-vif-unplugged-4adba740-03c5-45a4-8b6f-3cffa5076199 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:45:38 compute-0 nova_compute[186662]: 2026-02-19 19:45:38.160 186666 DEBUG nova.compute.manager [req-ac51449f-c5d0-4c53-a09f-be33ac1fc47b req-c7eb0c13-f4a9-42ec-abbe-ed10421999d0 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Received event network-vif-unplugged-4adba740-03c5-45a4-8b6f-3cffa5076199 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:45:38 compute-0 nova_compute[186662]: 2026-02-19 19:45:38.386 186666 INFO nova.compute.manager [-] [instance: 98af52ad-2964-4557-85f1-eb0343d8f085] Took 1.50 seconds to deallocate network for instance.
Feb 19 19:45:38 compute-0 nova_compute[186662]: 2026-02-19 19:45:38.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:45:38 compute-0 nova_compute[186662]: 2026-02-19 19:45:38.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Feb 19 19:45:38 compute-0 nova_compute[186662]: 2026-02-19 19:45:38.906 186666 DEBUG oslo_concurrency.lockutils [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:45:38 compute-0 nova_compute[186662]: 2026-02-19 19:45:38.907 186666 DEBUG oslo_concurrency.lockutils [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:45:38 compute-0 nova_compute[186662]: 2026-02-19 19:45:38.958 186666 DEBUG nova.compute.provider_tree [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:45:39 compute-0 nova_compute[186662]: 2026-02-19 19:45:39.080 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Feb 19 19:45:39 compute-0 podman[215841]: 2026-02-19 19:45:39.271766886 +0000 UTC m=+0.050302877 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 19 19:45:39 compute-0 nova_compute[186662]: 2026-02-19 19:45:39.464 186666 DEBUG nova.scheduler.client.report [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:45:39 compute-0 nova_compute[186662]: 2026-02-19 19:45:39.972 186666 DEBUG oslo_concurrency.lockutils [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.065s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:45:39 compute-0 nova_compute[186662]: 2026-02-19 19:45:39.994 186666 INFO nova.scheduler.client.report [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Deleted allocations for instance 98af52ad-2964-4557-85f1-eb0343d8f085
Feb 19 19:45:40 compute-0 nova_compute[186662]: 2026-02-19 19:45:40.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:45:40 compute-0 nova_compute[186662]: 2026-02-19 19:45:40.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Feb 19 19:45:40 compute-0 nova_compute[186662]: 2026-02-19 19:45:40.690 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:41 compute-0 nova_compute[186662]: 2026-02-19 19:45:41.017 186666 DEBUG oslo_concurrency.lockutils [None req-3c044a59-0c8b-4f5d-ae24-46c2165d1563 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Lock "98af52ad-2964-4557-85f1-eb0343d8f085" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.969s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:45:41 compute-0 nova_compute[186662]: 2026-02-19 19:45:41.364 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:45 compute-0 nova_compute[186662]: 2026-02-19 19:45:45.691 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:46 compute-0 nova_compute[186662]: 2026-02-19 19:45:46.423 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:50 compute-0 nova_compute[186662]: 2026-02-19 19:45:50.692 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:51 compute-0 nova_compute[186662]: 2026-02-19 19:45:51.433 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:54 compute-0 sshd-session[215867]: Invalid user nexus from 189.165.79.177 port 60434
Feb 19 19:45:54 compute-0 sshd-session[215867]: Received disconnect from 189.165.79.177 port 60434:11: Bye Bye [preauth]
Feb 19 19:45:54 compute-0 sshd-session[215867]: Disconnected from invalid user nexus 189.165.79.177 port 60434 [preauth]
Feb 19 19:45:56 compute-0 nova_compute[186662]: 2026-02-19 19:45:56.037 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:56 compute-0 nova_compute[186662]: 2026-02-19 19:45:56.434 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:45:59 compute-0 podman[196025]: time="2026-02-19T19:45:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:45:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:45:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:45:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:45:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2199 "" "Go-http-client/1.1"
Feb 19 19:46:00 compute-0 podman[215869]: 2026-02-19 19:46:00.277259928 +0000 UTC m=+0.052843669 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest)
Feb 19 19:46:01 compute-0 nova_compute[186662]: 2026-02-19 19:46:01.040 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:01 compute-0 openstack_network_exporter[198916]: ERROR   19:46:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:46:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:46:01 compute-0 openstack_network_exporter[198916]: ERROR   19:46:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:46:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:46:01 compute-0 nova_compute[186662]: 2026-02-19 19:46:01.435 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:04 compute-0 podman[215888]: 2026-02-19 19:46:04.275200459 +0000 UTC m=+0.050823931 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, version=9.7, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Feb 19 19:46:06 compute-0 nova_compute[186662]: 2026-02-19 19:46:06.042 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:06 compute-0 nova_compute[186662]: 2026-02-19 19:46:06.220 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:46:06 compute-0 podman[215911]: 2026-02-19 19:46:06.292029729 +0000 UTC m=+0.071954153 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Feb 19 19:46:06 compute-0 nova_compute[186662]: 2026-02-19 19:46:06.437 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:07 compute-0 nova_compute[186662]: 2026-02-19 19:46:07.273 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:07 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:07.274 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:46:07 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:07.274 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:46:10 compute-0 podman[215939]: 2026-02-19 19:46:10.2705974 +0000 UTC m=+0.048468675 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 19 19:46:11 compute-0 nova_compute[186662]: 2026-02-19 19:46:11.044 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:11 compute-0 nova_compute[186662]: 2026-02-19 19:46:11.438 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:16 compute-0 nova_compute[186662]: 2026-02-19 19:46:16.082 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:16 compute-0 nova_compute[186662]: 2026-02-19 19:46:16.440 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:17.275 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:46:21 compute-0 nova_compute[186662]: 2026-02-19 19:46:21.083 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:21 compute-0 nova_compute[186662]: 2026-02-19 19:46:21.442 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:23 compute-0 nova_compute[186662]: 2026-02-19 19:46:23.579 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:46:26 compute-0 nova_compute[186662]: 2026-02-19 19:46:26.085 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:26 compute-0 nova_compute[186662]: 2026-02-19 19:46:26.444 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:27 compute-0 nova_compute[186662]: 2026-02-19 19:46:27.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:46:29 compute-0 nova_compute[186662]: 2026-02-19 19:46:29.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:46:29 compute-0 nova_compute[186662]: 2026-02-19 19:46:29.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:46:29 compute-0 podman[196025]: time="2026-02-19T19:46:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:46:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:46:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:46:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:46:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2195 "" "Go-http-client/1.1"
Feb 19 19:46:30 compute-0 nova_compute[186662]: 2026-02-19 19:46:30.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:46:30 compute-0 nova_compute[186662]: 2026-02-19 19:46:30.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:46:31 compute-0 nova_compute[186662]: 2026-02-19 19:46:31.085 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:31 compute-0 nova_compute[186662]: 2026-02-19 19:46:31.265 186666 DEBUG nova.virt.libvirt.driver [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Creating tmpfile /var/lib/nova/instances/tmp4pezjjdu to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Feb 19 19:46:31 compute-0 nova_compute[186662]: 2026-02-19 19:46:31.265 186666 WARNING neutronclient.v2_0.client [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:46:31 compute-0 podman[215964]: 2026-02-19 19:46:31.270559361 +0000 UTC m=+0.049393286 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:46:31 compute-0 nova_compute[186662]: 2026-02-19 19:46:31.273 186666 DEBUG nova.compute.manager [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4pezjjdu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Feb 19 19:46:31 compute-0 openstack_network_exporter[198916]: ERROR   19:46:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:46:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:46:31 compute-0 openstack_network_exporter[198916]: ERROR   19:46:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:46:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:46:31 compute-0 nova_compute[186662]: 2026-02-19 19:46:31.463 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:32.152 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:46:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:32.154 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:46:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:32.154 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:46:32 compute-0 nova_compute[186662]: 2026-02-19 19:46:32.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:46:33 compute-0 nova_compute[186662]: 2026-02-19 19:46:33.091 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:46:33 compute-0 nova_compute[186662]: 2026-02-19 19:46:33.091 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:46:33 compute-0 nova_compute[186662]: 2026-02-19 19:46:33.092 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:46:33 compute-0 nova_compute[186662]: 2026-02-19 19:46:33.092 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:46:33 compute-0 nova_compute[186662]: 2026-02-19 19:46:33.214 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:46:33 compute-0 nova_compute[186662]: 2026-02-19 19:46:33.215 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:46:33 compute-0 nova_compute[186662]: 2026-02-19 19:46:33.227 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:46:33 compute-0 nova_compute[186662]: 2026-02-19 19:46:33.228 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5824MB free_disk=72.97544860839844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:46:33 compute-0 nova_compute[186662]: 2026-02-19 19:46:33.229 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:46:33 compute-0 nova_compute[186662]: 2026-02-19 19:46:33.229 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:46:33 compute-0 nova_compute[186662]: 2026-02-19 19:46:33.391 186666 WARNING neutronclient.v2_0.client [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:46:34 compute-0 nova_compute[186662]: 2026-02-19 19:46:34.793 186666 WARNING nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance 33129d90-e523-40cb-8bf0-73fb4cc7e32e has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 1151, 'VCPU': 1}}.
Feb 19 19:46:34 compute-0 nova_compute[186662]: 2026-02-19 19:46:34.794 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:46:34 compute-0 nova_compute[186662]: 2026-02-19 19:46:34.794 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:46:33 up  1:17,  0 user,  load average: 0.04, 0.15, 0.26\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:46:34 compute-0 nova_compute[186662]: 2026-02-19 19:46:34.867 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:46:35 compute-0 podman[215986]: 2026-02-19 19:46:35.288541039 +0000 UTC m=+0.064674960 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 19 19:46:35 compute-0 nova_compute[186662]: 2026-02-19 19:46:35.374 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:46:35 compute-0 nova_compute[186662]: 2026-02-19 19:46:35.885 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:46:35 compute-0 nova_compute[186662]: 2026-02-19 19:46:35.885 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.656s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:46:36 compute-0 nova_compute[186662]: 2026-02-19 19:46:36.087 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:36 compute-0 nova_compute[186662]: 2026-02-19 19:46:36.465 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:36 compute-0 nova_compute[186662]: 2026-02-19 19:46:36.886 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:46:36 compute-0 nova_compute[186662]: 2026-02-19 19:46:36.886 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:46:37 compute-0 podman[216009]: 2026-02-19 19:46:37.288939777 +0000 UTC m=+0.064197807 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller)
Feb 19 19:46:37 compute-0 nova_compute[186662]: 2026-02-19 19:46:37.557 186666 DEBUG nova.compute.manager [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4pezjjdu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='33129d90-e523-40cb-8bf0-73fb4cc7e32e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Feb 19 19:46:38 compute-0 nova_compute[186662]: 2026-02-19 19:46:38.568 186666 DEBUG oslo_concurrency.lockutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-33129d90-e523-40cb-8bf0-73fb4cc7e32e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:46:38 compute-0 nova_compute[186662]: 2026-02-19 19:46:38.568 186666 DEBUG oslo_concurrency.lockutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-33129d90-e523-40cb-8bf0-73fb4cc7e32e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:46:38 compute-0 nova_compute[186662]: 2026-02-19 19:46:38.569 186666 DEBUG nova.network.neutron [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:46:39 compute-0 nova_compute[186662]: 2026-02-19 19:46:39.075 186666 WARNING neutronclient.v2_0.client [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:46:39 compute-0 nova_compute[186662]: 2026-02-19 19:46:39.399 186666 WARNING neutronclient.v2_0.client [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:46:39 compute-0 nova_compute[186662]: 2026-02-19 19:46:39.540 186666 DEBUG nova.network.neutron [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Updating instance_info_cache with network_info: [{"id": "96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3", "address": "fa:16:3e:08:46:4f", "network": {"id": "c593f4fa-8caf-4204-a168-7d36dea7afd9", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-530295078-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "494c6c3df70542bd8ad63a3ad2241fe8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96744eb1-bc", "ovs_interfaceid": "96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.046 186666 DEBUG oslo_concurrency.lockutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-33129d90-e523-40cb-8bf0-73fb4cc7e32e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.061 186666 DEBUG nova.virt.libvirt.driver [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4pezjjdu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='33129d90-e523-40cb-8bf0-73fb4cc7e32e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.061 186666 DEBUG nova.virt.libvirt.driver [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Creating instance directory: /var/lib/nova/instances/33129d90-e523-40cb-8bf0-73fb4cc7e32e pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.062 186666 DEBUG nova.virt.libvirt.driver [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Creating disk.info with the contents: {'/var/lib/nova/instances/33129d90-e523-40cb-8bf0-73fb4cc7e32e/disk': 'qcow2', '/var/lib/nova/instances/33129d90-e523-40cb-8bf0-73fb4cc7e32e/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.062 186666 DEBUG nova.virt.libvirt.driver [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.063 186666 DEBUG nova.objects.instance [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 33129d90-e523-40cb-8bf0-73fb4cc7e32e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.568 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.571 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.572 186666 DEBUG oslo_concurrency.processutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.614 186666 DEBUG oslo_concurrency.processutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.615 186666 DEBUG oslo_concurrency.lockutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.615 186666 DEBUG oslo_concurrency.lockutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.615 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.618 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.618 186666 DEBUG oslo_concurrency.processutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.666 186666 DEBUG oslo_concurrency.processutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.667 186666 DEBUG oslo_concurrency.processutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/33129d90-e523-40cb-8bf0-73fb4cc7e32e/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.695 186666 DEBUG oslo_concurrency.processutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/33129d90-e523-40cb-8bf0-73fb4cc7e32e/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.696 186666 DEBUG oslo_concurrency.lockutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.081s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.697 186666 DEBUG oslo_concurrency.processutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.736 186666 DEBUG oslo_concurrency.processutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.737 186666 DEBUG nova.virt.disk.api [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Checking if we can resize image /var/lib/nova/instances/33129d90-e523-40cb-8bf0-73fb4cc7e32e/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.737 186666 DEBUG oslo_concurrency.processutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/33129d90-e523-40cb-8bf0-73fb4cc7e32e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.779 186666 DEBUG oslo_concurrency.processutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/33129d90-e523-40cb-8bf0-73fb4cc7e32e/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.780 186666 DEBUG nova.virt.disk.api [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Cannot resize image /var/lib/nova/instances/33129d90-e523-40cb-8bf0-73fb4cc7e32e/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:46:40 compute-0 nova_compute[186662]: 2026-02-19 19:46:40.780 186666 DEBUG nova.objects.instance [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'migration_context' on Instance uuid 33129d90-e523-40cb-8bf0-73fb4cc7e32e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.135 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:41 compute-0 podman[216050]: 2026-02-19 19:46:41.269534896 +0000 UTC m=+0.050352309 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.285 186666 DEBUG nova.objects.base [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Object Instance<33129d90-e523-40cb-8bf0-73fb4cc7e32e> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.286 186666 DEBUG oslo_concurrency.processutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/33129d90-e523-40cb-8bf0-73fb4cc7e32e/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.298 186666 DEBUG oslo_concurrency.processutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/33129d90-e523-40cb-8bf0-73fb4cc7e32e/disk.config 497664" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.299 186666 DEBUG nova.virt.libvirt.driver [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.300 186666 DEBUG nova.virt.libvirt.vif [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-02-19T19:45:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-361117094',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-361117094',id=24,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:45:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e7e566fe9744481b8016f2a804a68c2b',ramdisk_id='',reservation_id='r-qz508hpy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_
model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-285058964',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-285058964-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:45:58Z,user_data=None,user_id='eb789bc56fc941ed8761873b0815c588',uuid=33129d90-e523-40cb-8bf0-73fb4cc7e32e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3", "address": "fa:16:3e:08:46:4f", "network": {"id": "c593f4fa-8caf-4204-a168-7d36dea7afd9", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-530295078-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "494c6c3df70542bd8ad63a3ad2241fe8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap96744eb1-bc", "ovs_interfaceid": "96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.300 186666 DEBUG nova.network.os_vif_util [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3", "address": "fa:16:3e:08:46:4f", "network": {"id": "c593f4fa-8caf-4204-a168-7d36dea7afd9", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-530295078-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "494c6c3df70542bd8ad63a3ad2241fe8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap96744eb1-bc", "ovs_interfaceid": "96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.301 186666 DEBUG nova.network.os_vif_util [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:46:4f,bridge_name='br-int',has_traffic_filtering=True,id=96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3,network=Network(c593f4fa-8caf-4204-a168-7d36dea7afd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96744eb1-bc') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.301 186666 DEBUG os_vif [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:46:4f,bridge_name='br-int',has_traffic_filtering=True,id=96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3,network=Network(c593f4fa-8caf-4204-a168-7d36dea7afd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96744eb1-bc') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.301 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.302 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.302 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.303 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.303 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '045a3afb-12aa-5f5a-b40e-8057fb662306', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.304 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.305 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.307 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.307 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap96744eb1-bc, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.308 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap96744eb1-bc, col_values=(('qos', UUID('42b2c406-231a-4b00-b071-954c4ba16e64')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.308 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap96744eb1-bc, col_values=(('external_ids', {'iface-id': '96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:08:46:4f', 'vm-uuid': '33129d90-e523-40cb-8bf0-73fb4cc7e32e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.309 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:41 compute-0 NetworkManager[56519]: <info>  [1771530401.3101] manager: (tap96744eb1-bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.312 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.314 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.314 186666 INFO os_vif [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:46:4f,bridge_name='br-int',has_traffic_filtering=True,id=96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3,network=Network(c593f4fa-8caf-4204-a168-7d36dea7afd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96744eb1-bc')
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.315 186666 DEBUG nova.virt.libvirt.driver [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.315 186666 DEBUG nova.compute.manager [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4pezjjdu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='33129d90-e523-40cb-8bf0-73fb4cc7e32e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Feb 19 19:46:41 compute-0 nova_compute[186662]: 2026-02-19 19:46:41.316 186666 WARNING neutronclient.v2_0.client [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:46:42 compute-0 nova_compute[186662]: 2026-02-19 19:46:42.029 186666 WARNING neutronclient.v2_0.client [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:46:43 compute-0 nova_compute[186662]: 2026-02-19 19:46:43.308 186666 DEBUG nova.network.neutron [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Port 96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Feb 19 19:46:43 compute-0 nova_compute[186662]: 2026-02-19 19:46:43.321 186666 DEBUG nova.compute.manager [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4pezjjdu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='33129d90-e523-40cb-8bf0-73fb4cc7e32e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Feb 19 19:46:45 compute-0 ovn_controller[96653]: 2026-02-19T19:46:45Z|00181|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 19 19:46:46 compute-0 kernel: tap96744eb1-bc: entered promiscuous mode
Feb 19 19:46:46 compute-0 NetworkManager[56519]: <info>  [1771530406.0650] manager: (tap96744eb1-bc): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Feb 19 19:46:46 compute-0 ovn_controller[96653]: 2026-02-19T19:46:46Z|00182|binding|INFO|Claiming lport 96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3 for this additional chassis.
Feb 19 19:46:46 compute-0 ovn_controller[96653]: 2026-02-19T19:46:46Z|00183|binding|INFO|96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3: Claiming fa:16:3e:08:46:4f 10.100.0.3
Feb 19 19:46:46 compute-0 nova_compute[186662]: 2026-02-19 19:46:46.066 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.071 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:08:46:4f 10.100.0.3'], port_security=['fa:16:3e:08:46:4f 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '33129d90-e523-40cb-8bf0-73fb4cc7e32e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c593f4fa-8caf-4204-a168-7d36dea7afd9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e7e566fe9744481b8016f2a804a68c2b', 'neutron:revision_number': '10', 'neutron:security_group_ids': '1f057594-2461-4678-b2b0-6a6c86e352f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a41cc354-2c45-4738-99b1-e4951b7f67ae, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.071 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3 in datapath c593f4fa-8caf-4204-a168-7d36dea7afd9 unbound from our chassis
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.073 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c593f4fa-8caf-4204-a168-7d36dea7afd9
Feb 19 19:46:46 compute-0 nova_compute[186662]: 2026-02-19 19:46:46.075 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:46 compute-0 ovn_controller[96653]: 2026-02-19T19:46:46Z|00184|binding|INFO|Setting lport 96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3 ovn-installed in OVS
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.082 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a00c1fc0-3c89-4b26-b5dc-d2d94fd748cd]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.082 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc593f4fa-81 in ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.083 207540 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc593f4fa-80 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.083 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[1779bc0b-3fd9-4f2b-852b-2cfc8cd2ada0]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.084 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[fde326db-55fa-4d96-b9d0-59b083464652]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:46:46 compute-0 systemd-machined[156014]: New machine qemu-17-instance-00000018.
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.094 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[6e5febb5-9cbb-4b36-b5db-305827e25b26]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.100 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[58a847c7-8d8d-46c6-a6b6-22066eb5d5c0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:46:46 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-00000018.
Feb 19 19:46:46 compute-0 systemd-udevd[216099]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:46:46 compute-0 NetworkManager[56519]: <info>  [1771530406.1231] device (tap96744eb1-bc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.121 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[c8489683-47f9-4b7e-94fb-d95a1cdc5263]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:46:46 compute-0 NetworkManager[56519]: <info>  [1771530406.1241] device (tap96744eb1-bc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.125 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[12781bad-2bf2-47c9-91b3-87906f21737c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:46:46 compute-0 NetworkManager[56519]: <info>  [1771530406.1265] manager: (tapc593f4fa-80): new Veth device (/org/freedesktop/NetworkManager/Devices/73)
Feb 19 19:46:46 compute-0 nova_compute[186662]: 2026-02-19 19:46:46.136 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.144 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[c2d9a113-32c7-47df-8195-aa866fd83d93]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.149 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[8c4ec6e8-e07f-4143-b559-a50929d0bfd5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:46:46 compute-0 NetworkManager[56519]: <info>  [1771530406.1620] device (tapc593f4fa-80): carrier: link connected
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.163 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[e31cc6be-7aeb-426c-adf0-510af73ee2aa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.172 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[44a6f1c3-9665-40a0-b41b-f95940782b1a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc593f4fa-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f5:91:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467572, 'reachable_time': 30406, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216126, 'error': None, 'target': 'ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.179 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a7ae8ff5-5b24-47a5-aa82-cabddf668229]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef5:915e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467572, 'tstamp': 467572}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216127, 'error': None, 'target': 'ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.186 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[d9c4f2d6-6f1c-4a89-bc02-d411a4516afe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc593f4fa-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f5:91:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467572, 'reachable_time': 30406, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216129, 'error': None, 'target': 'ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.203 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[bed90ad5-9e7d-4a73-9154-050fef7d9b32]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.232 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[9e5a8741-fe52-4e39-a8af-63cb4941c381]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.232 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc593f4fa-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.233 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.233 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc593f4fa-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:46:46 compute-0 nova_compute[186662]: 2026-02-19 19:46:46.235 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:46 compute-0 kernel: tapc593f4fa-80: entered promiscuous mode
Feb 19 19:46:46 compute-0 NetworkManager[56519]: <info>  [1771530406.2367] manager: (tapc593f4fa-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Feb 19 19:46:46 compute-0 nova_compute[186662]: 2026-02-19 19:46:46.237 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.238 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc593f4fa-80, col_values=(('external_ids', {'iface-id': '35fb9fb2-2f23-4514-91e1-a3a8a8349d32'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:46:46 compute-0 nova_compute[186662]: 2026-02-19 19:46:46.239 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:46 compute-0 ovn_controller[96653]: 2026-02-19T19:46:46Z|00185|binding|INFO|Releasing lport 35fb9fb2-2f23-4514-91e1-a3a8a8349d32 from this chassis (sb_readonly=0)
Feb 19 19:46:46 compute-0 nova_compute[186662]: 2026-02-19 19:46:46.239 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.240 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ce75df36-7d28-490e-8177-338d10982365]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.241 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.241 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.241 105986 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for c593f4fa-8caf-4204-a168-7d36dea7afd9 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.241 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.242 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[e019d67b-65e2-4bc2-a82d-be142f86d089]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.242 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.243 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[cecf5c0d-3b3f-43b1-9df8-b2e589b3fdf5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.243 105986 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: global
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     log         /dev/log local0 debug
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     log-tag     haproxy-metadata-proxy-c593f4fa-8caf-4204-a168-7d36dea7afd9
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     user        root
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     group       root
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     maxconn     1024
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     pidfile     /var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     daemon
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: defaults
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     log global
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     mode http
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     option httplog
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     option dontlognull
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     option http-server-close
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     option forwardfor
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     retries                 3
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     timeout http-request    30s
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     timeout connect         30s
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     timeout client          32s
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     timeout server          32s
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     timeout http-keep-alive 30s
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: listen listener
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     bind 169.254.169.254:80
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     server metadata /var/lib/neutron/metadata_proxy
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:     http-request add-header X-OVN-Network-ID c593f4fa-8caf-4204-a168-7d36dea7afd9
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Feb 19 19:46:46 compute-0 nova_compute[186662]: 2026-02-19 19:46:46.243 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:46 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:46:46.244 105986 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9', 'env', 'PROCESS_TAG=haproxy-c593f4fa-8caf-4204-a168-7d36dea7afd9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c593f4fa-8caf-4204-a168-7d36dea7afd9.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Feb 19 19:46:46 compute-0 nova_compute[186662]: 2026-02-19 19:46:46.309 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:46 compute-0 nova_compute[186662]: 2026-02-19 19:46:46.501 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:46:46 compute-0 podman[216168]: 2026-02-19 19:46:46.554080586 +0000 UTC m=+0.018508521 image pull dfc4ee2c621ea595c16a7cd7a8233293e80df6b99346b1ce1a7ed22a35aace27 38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Feb 19 19:46:46 compute-0 podman[216168]: 2026-02-19 19:46:46.708369676 +0000 UTC m=+0.172797631 container create aed4fb013ec37f62102ed9909f66c8d988b2f3a7dda2b44b4dc6b129c8f35df1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0)
Feb 19 19:46:46 compute-0 systemd[1]: Started libpod-conmon-aed4fb013ec37f62102ed9909f66c8d988b2f3a7dda2b44b4dc6b129c8f35df1.scope.
Feb 19 19:46:46 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:46:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f32445e587a5dcb835c1b74ca160e38f9b4d42b6b209f7f04712cbdbd1ceeb1f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 19 19:46:46 compute-0 podman[216168]: 2026-02-19 19:46:46.81442049 +0000 UTC m=+0.278848495 container init aed4fb013ec37f62102ed9909f66c8d988b2f3a7dda2b44b4dc6b129c8f35df1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Feb 19 19:46:46 compute-0 podman[216168]: 2026-02-19 19:46:46.81862467 +0000 UTC m=+0.283052635 container start aed4fb013ec37f62102ed9909f66c8d988b2f3a7dda2b44b4dc6b129c8f35df1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Feb 19 19:46:46 compute-0 neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9[216183]: [NOTICE]   (216187) : New worker (216189) forked
Feb 19 19:46:46 compute-0 neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9[216183]: [NOTICE]   (216187) : Loading success.
Feb 19 19:46:47 compute-0 nova_compute[186662]: 2026-02-19 19:46:47.032 186666 WARNING nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] While synchronizing instance power states, found 0 instances in the database and 1 instances on the hypervisor.
Feb 19 19:46:49 compute-0 ovn_controller[96653]: 2026-02-19T19:46:49Z|00186|binding|INFO|Claiming lport 96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3 for this chassis.
Feb 19 19:46:49 compute-0 ovn_controller[96653]: 2026-02-19T19:46:49Z|00187|binding|INFO|96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3: Claiming fa:16:3e:08:46:4f 10.100.0.3
Feb 19 19:46:49 compute-0 ovn_controller[96653]: 2026-02-19T19:46:49Z|00188|binding|INFO|Setting lport 96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3 up in Southbound
Feb 19 19:46:50 compute-0 nova_compute[186662]: 2026-02-19 19:46:50.700 186666 INFO nova.compute.manager [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Post operation of migration started
Feb 19 19:46:50 compute-0 nova_compute[186662]: 2026-02-19 19:46:50.700 186666 WARNING neutronclient.v2_0.client [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:46:50 compute-0 nova_compute[186662]: 2026-02-19 19:46:50.809 186666 WARNING neutronclient.v2_0.client [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:46:50 compute-0 nova_compute[186662]: 2026-02-19 19:46:50.810 186666 WARNING neutronclient.v2_0.client [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:46:50 compute-0 nova_compute[186662]: 2026-02-19 19:46:50.879 186666 DEBUG oslo_concurrency.lockutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-33129d90-e523-40cb-8bf0-73fb4cc7e32e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:46:50 compute-0 nova_compute[186662]: 2026-02-19 19:46:50.879 186666 DEBUG oslo_concurrency.lockutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-33129d90-e523-40cb-8bf0-73fb4cc7e32e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:46:50 compute-0 nova_compute[186662]: 2026-02-19 19:46:50.880 186666 DEBUG nova.network.neutron [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:46:51 compute-0 nova_compute[186662]: 2026-02-19 19:46:51.138 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:51 compute-0 nova_compute[186662]: 2026-02-19 19:46:51.311 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:51 compute-0 nova_compute[186662]: 2026-02-19 19:46:51.385 186666 WARNING neutronclient.v2_0.client [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:46:51 compute-0 nova_compute[186662]: 2026-02-19 19:46:51.719 186666 WARNING neutronclient.v2_0.client [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:46:52 compute-0 nova_compute[186662]: 2026-02-19 19:46:52.045 186666 DEBUG nova.network.neutron [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Updating instance_info_cache with network_info: [{"id": "96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3", "address": "fa:16:3e:08:46:4f", "network": {"id": "c593f4fa-8caf-4204-a168-7d36dea7afd9", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-530295078-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "494c6c3df70542bd8ad63a3ad2241fe8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96744eb1-bc", "ovs_interfaceid": "96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:46:52 compute-0 nova_compute[186662]: 2026-02-19 19:46:52.557 186666 DEBUG oslo_concurrency.lockutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-33129d90-e523-40cb-8bf0-73fb4cc7e32e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:46:53 compute-0 nova_compute[186662]: 2026-02-19 19:46:53.082 186666 DEBUG oslo_concurrency.lockutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:46:53 compute-0 nova_compute[186662]: 2026-02-19 19:46:53.083 186666 DEBUG oslo_concurrency.lockutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:46:53 compute-0 nova_compute[186662]: 2026-02-19 19:46:53.083 186666 DEBUG oslo_concurrency.lockutils [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:46:53 compute-0 nova_compute[186662]: 2026-02-19 19:46:53.086 186666 INFO nova.virt.libvirt.driver [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 19 19:46:53 compute-0 virtqemud[186157]: Domain id=17 name='instance-00000018' uuid=33129d90-e523-40cb-8bf0-73fb4cc7e32e is tainted: custom-monitor
Feb 19 19:46:54 compute-0 nova_compute[186662]: 2026-02-19 19:46:54.093 186666 INFO nova.virt.libvirt.driver [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 19 19:46:55 compute-0 nova_compute[186662]: 2026-02-19 19:46:55.099 186666 INFO nova.virt.libvirt.driver [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 19 19:46:55 compute-0 nova_compute[186662]: 2026-02-19 19:46:55.104 186666 DEBUG nova.compute.manager [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:46:55 compute-0 nova_compute[186662]: 2026-02-19 19:46:55.896 186666 DEBUG nova.objects.instance [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Feb 19 19:46:56 compute-0 nova_compute[186662]: 2026-02-19 19:46:56.175 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:56 compute-0 nova_compute[186662]: 2026-02-19 19:46:56.313 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:46:56 compute-0 sshd-session[216212]: Received disconnect from 103.67.78.251 port 57658:11: Bye Bye [preauth]
Feb 19 19:46:56 compute-0 sshd-session[216212]: Disconnected from authenticating user root 103.67.78.251 port 57658 [preauth]
Feb 19 19:46:56 compute-0 nova_compute[186662]: 2026-02-19 19:46:56.919 186666 WARNING neutronclient.v2_0.client [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:46:57 compute-0 nova_compute[186662]: 2026-02-19 19:46:57.028 186666 WARNING neutronclient.v2_0.client [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:46:57 compute-0 nova_compute[186662]: 2026-02-19 19:46:57.029 186666 WARNING neutronclient.v2_0.client [None req-2be95f8e-b30c-4c9e-ad0a-4ca691fb32a5 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:46:59 compute-0 podman[196025]: time="2026-02-19T19:46:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:46:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:46:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17221 "" "Go-http-client/1.1"
Feb 19 19:46:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:46:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2664 "" "Go-http-client/1.1"
Feb 19 19:47:01 compute-0 nova_compute[186662]: 2026-02-19 19:47:01.177 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:01 compute-0 nova_compute[186662]: 2026-02-19 19:47:01.315 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:01 compute-0 openstack_network_exporter[198916]: ERROR   19:47:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:47:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:47:01 compute-0 openstack_network_exporter[198916]: ERROR   19:47:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:47:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:47:02 compute-0 podman[216214]: 2026-02-19 19:47:02.276348461 +0000 UTC m=+0.051394874 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 19 19:47:04 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:04.141 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:47:04 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:04.141 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:47:04 compute-0 nova_compute[186662]: 2026-02-19 19:47:04.142 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:06 compute-0 nova_compute[186662]: 2026-02-19 19:47:06.179 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:06 compute-0 podman[216235]: 2026-02-19 19:47:06.27553045 +0000 UTC m=+0.051697971 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1770267347, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 19 19:47:06 compute-0 nova_compute[186662]: 2026-02-19 19:47:06.317 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:07 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:07.145 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:47:08 compute-0 podman[216256]: 2026-02-19 19:47:08.305967684 +0000 UTC m=+0.072222268 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 19 19:47:10 compute-0 nova_compute[186662]: 2026-02-19 19:47:10.251 186666 DEBUG oslo_concurrency.lockutils [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Acquiring lock "33129d90-e523-40cb-8bf0-73fb4cc7e32e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:47:10 compute-0 nova_compute[186662]: 2026-02-19 19:47:10.251 186666 DEBUG oslo_concurrency.lockutils [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Lock "33129d90-e523-40cb-8bf0-73fb4cc7e32e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:47:10 compute-0 nova_compute[186662]: 2026-02-19 19:47:10.251 186666 DEBUG oslo_concurrency.lockutils [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Acquiring lock "33129d90-e523-40cb-8bf0-73fb4cc7e32e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:47:10 compute-0 nova_compute[186662]: 2026-02-19 19:47:10.252 186666 DEBUG oslo_concurrency.lockutils [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Lock "33129d90-e523-40cb-8bf0-73fb4cc7e32e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:47:10 compute-0 nova_compute[186662]: 2026-02-19 19:47:10.252 186666 DEBUG oslo_concurrency.lockutils [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Lock "33129d90-e523-40cb-8bf0-73fb4cc7e32e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:47:10 compute-0 nova_compute[186662]: 2026-02-19 19:47:10.263 186666 INFO nova.compute.manager [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Terminating instance
Feb 19 19:47:10 compute-0 nova_compute[186662]: 2026-02-19 19:47:10.777 186666 DEBUG nova.compute.manager [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:47:10 compute-0 kernel: tap96744eb1-bc (unregistering): left promiscuous mode
Feb 19 19:47:10 compute-0 NetworkManager[56519]: <info>  [1771530430.8104] device (tap96744eb1-bc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:47:10 compute-0 ovn_controller[96653]: 2026-02-19T19:47:10Z|00189|binding|INFO|Releasing lport 96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3 from this chassis (sb_readonly=0)
Feb 19 19:47:10 compute-0 ovn_controller[96653]: 2026-02-19T19:47:10Z|00190|binding|INFO|Setting lport 96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3 down in Southbound
Feb 19 19:47:10 compute-0 nova_compute[186662]: 2026-02-19 19:47:10.817 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:10 compute-0 ovn_controller[96653]: 2026-02-19T19:47:10Z|00191|binding|INFO|Removing iface tap96744eb1-bc ovn-installed in OVS
Feb 19 19:47:10 compute-0 nova_compute[186662]: 2026-02-19 19:47:10.818 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:10.826 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:08:46:4f 10.100.0.3'], port_security=['fa:16:3e:08:46:4f 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '33129d90-e523-40cb-8bf0-73fb4cc7e32e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c593f4fa-8caf-4204-a168-7d36dea7afd9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e7e566fe9744481b8016f2a804a68c2b', 'neutron:revision_number': '14', 'neutron:security_group_ids': '1f057594-2461-4678-b2b0-6a6c86e352f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a41cc354-2c45-4738-99b1-e4951b7f67ae, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:47:10 compute-0 nova_compute[186662]: 2026-02-19 19:47:10.826 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:10.827 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3 in datapath c593f4fa-8caf-4204-a168-7d36dea7afd9 unbound from our chassis
Feb 19 19:47:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:10.828 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c593f4fa-8caf-4204-a168-7d36dea7afd9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:47:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:10.829 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[4ccf055d-9e4d-46fa-b078-354db84942d5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:47:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:10.830 105986 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9 namespace which is not needed anymore
Feb 19 19:47:10 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000018.scope: Deactivated successfully.
Feb 19 19:47:10 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000018.scope: Consumed 2.142s CPU time.
Feb 19 19:47:10 compute-0 systemd-machined[156014]: Machine qemu-17-instance-00000018 terminated.
Feb 19 19:47:10 compute-0 neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9[216183]: [NOTICE]   (216187) : haproxy version is 3.0.5-8e879a5
Feb 19 19:47:10 compute-0 neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9[216183]: [NOTICE]   (216187) : path to executable is /usr/sbin/haproxy
Feb 19 19:47:10 compute-0 neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9[216183]: [WARNING]  (216187) : Exiting Master process...
Feb 19 19:47:10 compute-0 podman[216308]: 2026-02-19 19:47:10.914000579 +0000 UTC m=+0.025587860 container kill aed4fb013ec37f62102ed9909f66c8d988b2f3a7dda2b44b4dc6b129c8f35df1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0)
Feb 19 19:47:10 compute-0 neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9[216183]: [ALERT]    (216187) : Current worker (216189) exited with code 143 (Terminated)
Feb 19 19:47:10 compute-0 neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9[216183]: [WARNING]  (216187) : All workers exited. Exiting... (0)
Feb 19 19:47:10 compute-0 systemd[1]: libpod-aed4fb013ec37f62102ed9909f66c8d988b2f3a7dda2b44b4dc6b129c8f35df1.scope: Deactivated successfully.
Feb 19 19:47:10 compute-0 podman[216323]: 2026-02-19 19:47:10.945300163 +0000 UTC m=+0.016821321 container died aed4fb013ec37f62102ed9909f66c8d988b2f3a7dda2b44b4dc6b129c8f35df1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 19 19:47:10 compute-0 nova_compute[186662]: 2026-02-19 19:47:10.950 186666 DEBUG nova.compute.manager [req-d02296f8-959e-4700-aa1a-31e9fe6f6fe2 req-84a4943c-c8b6-4c9f-b05b-36362764921a 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Received event network-vif-unplugged-96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:47:10 compute-0 nova_compute[186662]: 2026-02-19 19:47:10.950 186666 DEBUG oslo_concurrency.lockutils [req-d02296f8-959e-4700-aa1a-31e9fe6f6fe2 req-84a4943c-c8b6-4c9f-b05b-36362764921a 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "33129d90-e523-40cb-8bf0-73fb4cc7e32e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:47:10 compute-0 nova_compute[186662]: 2026-02-19 19:47:10.951 186666 DEBUG oslo_concurrency.lockutils [req-d02296f8-959e-4700-aa1a-31e9fe6f6fe2 req-84a4943c-c8b6-4c9f-b05b-36362764921a 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "33129d90-e523-40cb-8bf0-73fb4cc7e32e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:47:10 compute-0 nova_compute[186662]: 2026-02-19 19:47:10.951 186666 DEBUG oslo_concurrency.lockutils [req-d02296f8-959e-4700-aa1a-31e9fe6f6fe2 req-84a4943c-c8b6-4c9f-b05b-36362764921a 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "33129d90-e523-40cb-8bf0-73fb4cc7e32e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:47:10 compute-0 nova_compute[186662]: 2026-02-19 19:47:10.951 186666 DEBUG nova.compute.manager [req-d02296f8-959e-4700-aa1a-31e9fe6f6fe2 req-84a4943c-c8b6-4c9f-b05b-36362764921a 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] No waiting events found dispatching network-vif-unplugged-96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:47:10 compute-0 nova_compute[186662]: 2026-02-19 19:47:10.951 186666 DEBUG nova.compute.manager [req-d02296f8-959e-4700-aa1a-31e9fe6f6fe2 req-84a4943c-c8b6-4c9f-b05b-36362764921a 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Received event network-vif-unplugged-96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:47:10 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aed4fb013ec37f62102ed9909f66c8d988b2f3a7dda2b44b4dc6b129c8f35df1-userdata-shm.mount: Deactivated successfully.
Feb 19 19:47:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-f32445e587a5dcb835c1b74ca160e38f9b4d42b6b209f7f04712cbdbd1ceeb1f-merged.mount: Deactivated successfully.
Feb 19 19:47:10 compute-0 podman[216323]: 2026-02-19 19:47:10.970448512 +0000 UTC m=+0.041969670 container cleanup aed4fb013ec37f62102ed9909f66c8d988b2f3a7dda2b44b4dc6b129c8f35df1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest)
Feb 19 19:47:10 compute-0 systemd[1]: libpod-conmon-aed4fb013ec37f62102ed9909f66c8d988b2f3a7dda2b44b4dc6b129c8f35df1.scope: Deactivated successfully.
Feb 19 19:47:10 compute-0 podman[216325]: 2026-02-19 19:47:10.986656038 +0000 UTC m=+0.050464502 container remove aed4fb013ec37f62102ed9909f66c8d988b2f3a7dda2b44b4dc6b129c8f35df1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_build_tag=watcher_latest)
Feb 19 19:47:10 compute-0 NetworkManager[56519]: <info>  [1771530430.9914] manager: (tap96744eb1-bc): new Tun device (/org/freedesktop/NetworkManager/Devices/75)
Feb 19 19:47:10 compute-0 nova_compute[186662]: 2026-02-19 19:47:10.992 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:10 compute-0 nova_compute[186662]: 2026-02-19 19:47:10.995 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:11.002 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[dfdf8ca3-edac-422c-83ed-07abeae5467e]: (4, ("Thu Feb 19 07:47:10 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9 (aed4fb013ec37f62102ed9909f66c8d988b2f3a7dda2b44b4dc6b129c8f35df1)\naed4fb013ec37f62102ed9909f66c8d988b2f3a7dda2b44b4dc6b129c8f35df1\nThu Feb 19 07:47:10 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9 (aed4fb013ec37f62102ed9909f66c8d988b2f3a7dda2b44b4dc6b129c8f35df1)\naed4fb013ec37f62102ed9909f66c8d988b2f3a7dda2b44b4dc6b129c8f35df1\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:47:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:11.003 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[385fa5a5-42c1-419b-b4a7-ea2f05558c26]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:47:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:11.003 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c593f4fa-8caf-4204-a168-7d36dea7afd9.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:47:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:11.004 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[00d17299-00d6-49ee-9e8d-88968c38720f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:47:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:11.004 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc593f4fa-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.006 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:11 compute-0 kernel: tapc593f4fa-80: left promiscuous mode
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.013 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:11.015 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[06963e4e-94b8-4545-bf2d-48398c4c1efb]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:47:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:11.028 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[76e8d7ab-80d3-4db4-81b4-b5d4c245be8d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:47:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:11.028 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[077ddcf4-8940-4209-a844-17f53aa7a406]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.033 186666 INFO nova.virt.libvirt.driver [-] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Instance destroyed successfully.
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.034 186666 DEBUG nova.objects.instance [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Lazy-loading 'resources' on Instance uuid 33129d90-e523-40cb-8bf0-73fb4cc7e32e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:47:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:11.039 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[394a3025-1a34-4e0c-811e-e80d2bd761ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467568, 'reachable_time': 35522, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216373, 'error': None, 'target': 'ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:47:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:11.041 106358 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c593f4fa-8caf-4204-a168-7d36dea7afd9 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Feb 19 19:47:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:11.041 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[f9c02aee-d5c3-41aa-9ef9-20d0a5b81b53]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:47:11 compute-0 systemd[1]: run-netns-ovnmeta\x2dc593f4fa\x2d8caf\x2d4204\x2da168\x2d7d36dea7afd9.mount: Deactivated successfully.
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.181 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.319 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.540 186666 DEBUG nova.virt.libvirt.vif [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-02-19T19:45:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-361117094',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-361117094',id=24,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:45:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e7e566fe9744481b8016f2a804a68c2b',ramdisk_id='',reservation_id='r-qz508hpy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-285058964',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-285058964-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:46:56Z,user_data=None,user_id='eb789bc56fc941ed8761873b0815c588',uuid=33129d90-e523-40cb-8bf0-73fb4cc7e32e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3", "address": "fa:16:3e:08:46:4f", "network": {"id": "c593f4fa-8caf-4204-a168-7d36dea7afd9", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-530295078-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "494c6c3df70542bd8ad63a3ad2241fe8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96744eb1-bc", "ovs_interfaceid": "96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.540 186666 DEBUG nova.network.os_vif_util [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Converting VIF {"id": "96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3", "address": "fa:16:3e:08:46:4f", "network": {"id": "c593f4fa-8caf-4204-a168-7d36dea7afd9", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-530295078-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "494c6c3df70542bd8ad63a3ad2241fe8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96744eb1-bc", "ovs_interfaceid": "96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.541 186666 DEBUG nova.network.os_vif_util [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:08:46:4f,bridge_name='br-int',has_traffic_filtering=True,id=96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3,network=Network(c593f4fa-8caf-4204-a168-7d36dea7afd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96744eb1-bc') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.541 186666 DEBUG os_vif [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:08:46:4f,bridge_name='br-int',has_traffic_filtering=True,id=96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3,network=Network(c593f4fa-8caf-4204-a168-7d36dea7afd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96744eb1-bc') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.542 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.543 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap96744eb1-bc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.544 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.545 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.546 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.546 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=42b2c406-231a-4b00-b071-954c4ba16e64) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.546 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.547 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.549 186666 INFO os_vif [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:08:46:4f,bridge_name='br-int',has_traffic_filtering=True,id=96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3,network=Network(c593f4fa-8caf-4204-a168-7d36dea7afd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96744eb1-bc')
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.549 186666 INFO nova.virt.libvirt.driver [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Deleting instance files /var/lib/nova/instances/33129d90-e523-40cb-8bf0-73fb4cc7e32e_del
Feb 19 19:47:11 compute-0 nova_compute[186662]: 2026-02-19 19:47:11.550 186666 INFO nova.virt.libvirt.driver [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Deletion of /var/lib/nova/instances/33129d90-e523-40cb-8bf0-73fb4cc7e32e_del complete
Feb 19 19:47:12 compute-0 nova_compute[186662]: 2026-02-19 19:47:12.060 186666 INFO nova.compute.manager [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Took 1.28 seconds to destroy the instance on the hypervisor.
Feb 19 19:47:12 compute-0 nova_compute[186662]: 2026-02-19 19:47:12.060 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:47:12 compute-0 nova_compute[186662]: 2026-02-19 19:47:12.060 186666 DEBUG nova.compute.manager [-] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:47:12 compute-0 nova_compute[186662]: 2026-02-19 19:47:12.060 186666 DEBUG nova.network.neutron [-] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:47:12 compute-0 nova_compute[186662]: 2026-02-19 19:47:12.060 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:47:12 compute-0 podman[216374]: 2026-02-19 19:47:12.271514334 +0000 UTC m=+0.045917502 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:47:13 compute-0 nova_compute[186662]: 2026-02-19 19:47:13.000 186666 DEBUG nova.compute.manager [req-9ae094fa-63e0-4c1f-bc8c-d95b841d252f req-b5e019d2-f505-41a7-91bf-995052ded6b2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Received event network-vif-unplugged-96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:47:13 compute-0 nova_compute[186662]: 2026-02-19 19:47:13.001 186666 DEBUG oslo_concurrency.lockutils [req-9ae094fa-63e0-4c1f-bc8c-d95b841d252f req-b5e019d2-f505-41a7-91bf-995052ded6b2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "33129d90-e523-40cb-8bf0-73fb4cc7e32e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:47:13 compute-0 nova_compute[186662]: 2026-02-19 19:47:13.001 186666 DEBUG oslo_concurrency.lockutils [req-9ae094fa-63e0-4c1f-bc8c-d95b841d252f req-b5e019d2-f505-41a7-91bf-995052ded6b2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "33129d90-e523-40cb-8bf0-73fb4cc7e32e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:47:13 compute-0 nova_compute[186662]: 2026-02-19 19:47:13.001 186666 DEBUG oslo_concurrency.lockutils [req-9ae094fa-63e0-4c1f-bc8c-d95b841d252f req-b5e019d2-f505-41a7-91bf-995052ded6b2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "33129d90-e523-40cb-8bf0-73fb4cc7e32e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:47:13 compute-0 nova_compute[186662]: 2026-02-19 19:47:13.001 186666 DEBUG nova.compute.manager [req-9ae094fa-63e0-4c1f-bc8c-d95b841d252f req-b5e019d2-f505-41a7-91bf-995052ded6b2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] No waiting events found dispatching network-vif-unplugged-96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:47:13 compute-0 nova_compute[186662]: 2026-02-19 19:47:13.001 186666 DEBUG nova.compute.manager [req-9ae094fa-63e0-4c1f-bc8c-d95b841d252f req-b5e019d2-f505-41a7-91bf-995052ded6b2 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Received event network-vif-unplugged-96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:47:13 compute-0 nova_compute[186662]: 2026-02-19 19:47:13.049 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:47:14 compute-0 nova_compute[186662]: 2026-02-19 19:47:14.014 186666 DEBUG nova.network.neutron [-] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:47:14 compute-0 nova_compute[186662]: 2026-02-19 19:47:14.521 186666 INFO nova.compute.manager [-] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Took 2.46 seconds to deallocate network for instance.
Feb 19 19:47:14 compute-0 sshd-session[216398]: Invalid user dev from 96.78.175.42 port 44180
Feb 19 19:47:14 compute-0 sshd-session[216398]: Received disconnect from 96.78.175.42 port 44180:11: Bye Bye [preauth]
Feb 19 19:47:14 compute-0 sshd-session[216398]: Disconnected from invalid user dev 96.78.175.42 port 44180 [preauth]
Feb 19 19:47:15 compute-0 nova_compute[186662]: 2026-02-19 19:47:15.037 186666 DEBUG oslo_concurrency.lockutils [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:47:15 compute-0 nova_compute[186662]: 2026-02-19 19:47:15.037 186666 DEBUG oslo_concurrency.lockutils [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:47:15 compute-0 nova_compute[186662]: 2026-02-19 19:47:15.042 186666 DEBUG oslo_concurrency.lockutils [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:47:15 compute-0 nova_compute[186662]: 2026-02-19 19:47:15.058 186666 DEBUG nova.compute.manager [req-5ba3b2ad-cf7e-4cbe-93d4-8f8c1807e78b req-54313e7c-a77e-4b12-a5be-03a85995b876 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 33129d90-e523-40cb-8bf0-73fb4cc7e32e] Received event network-vif-deleted-96744eb1-bc9c-4fb1-bf76-6f6dfd383aa3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:47:15 compute-0 nova_compute[186662]: 2026-02-19 19:47:15.073 186666 INFO nova.scheduler.client.report [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Deleted allocations for instance 33129d90-e523-40cb-8bf0-73fb4cc7e32e
Feb 19 19:47:16 compute-0 nova_compute[186662]: 2026-02-19 19:47:16.099 186666 DEBUG oslo_concurrency.lockutils [None req-f98e69a3-3a6c-4650-b9e9-e051c635e821 eb789bc56fc941ed8761873b0815c588 e7e566fe9744481b8016f2a804a68c2b - - default default] Lock "33129d90-e523-40cb-8bf0-73fb4cc7e32e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.848s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:47:16 compute-0 nova_compute[186662]: 2026-02-19 19:47:16.183 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:16 compute-0 nova_compute[186662]: 2026-02-19 19:47:16.547 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:20 compute-0 sshd-session[216401]: Invalid user teamspeak3 from 106.51.64.128 port 46704
Feb 19 19:47:20 compute-0 sshd-session[216401]: Received disconnect from 106.51.64.128 port 46704:11: Bye Bye [preauth]
Feb 19 19:47:20 compute-0 sshd-session[216401]: Disconnected from invalid user teamspeak3 106.51.64.128 port 46704 [preauth]
Feb 19 19:47:21 compute-0 nova_compute[186662]: 2026-02-19 19:47:21.186 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:21 compute-0 nova_compute[186662]: 2026-02-19 19:47:21.550 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:24 compute-0 nova_compute[186662]: 2026-02-19 19:47:24.106 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:47:26 compute-0 nova_compute[186662]: 2026-02-19 19:47:26.188 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:26 compute-0 nova_compute[186662]: 2026-02-19 19:47:26.551 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:27 compute-0 nova_compute[186662]: 2026-02-19 19:47:27.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:47:29 compute-0 nova_compute[186662]: 2026-02-19 19:47:29.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:47:29 compute-0 podman[196025]: time="2026-02-19T19:47:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:47:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:47:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:47:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:47:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2196 "" "Go-http-client/1.1"
Feb 19 19:47:31 compute-0 nova_compute[186662]: 2026-02-19 19:47:31.193 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:31 compute-0 openstack_network_exporter[198916]: ERROR   19:47:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:47:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:47:31 compute-0 openstack_network_exporter[198916]: ERROR   19:47:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:47:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:47:31 compute-0 nova_compute[186662]: 2026-02-19 19:47:31.553 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:31 compute-0 nova_compute[186662]: 2026-02-19 19:47:31.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:47:31 compute-0 nova_compute[186662]: 2026-02-19 19:47:31.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:47:31 compute-0 nova_compute[186662]: 2026-02-19 19:47:31.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:47:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:32.155 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:47:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:32.155 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:47:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:32.155 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:47:33 compute-0 podman[216404]: 2026-02-19 19:47:33.298433104 +0000 UTC m=+0.067882625 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true)
Feb 19 19:47:33 compute-0 nova_compute[186662]: 2026-02-19 19:47:33.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:47:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:33.756 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:02:e4 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fecbde57-58e9-4df0-aab7-14888d1477cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '09b52bffe1f548839b74e94d80ad9eb7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4b9fd371-626a-4670-b24e-8264e7c1e410, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=5190b70d-a81e-4df1-b581-b1a2cfd96252) old=Port_Binding(mac=['fa:16:3e:f7:02:e4'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fecbde57-58e9-4df0-aab7-14888d1477cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '09b52bffe1f548839b74e94d80ad9eb7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:47:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:33.758 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 5190b70d-a81e-4df1-b581-b1a2cfd96252 in datapath fecbde57-58e9-4df0-aab7-14888d1477cc updated
Feb 19 19:47:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:33.759 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fecbde57-58e9-4df0-aab7-14888d1477cc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:47:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:33.761 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[080aa67e-bb5d-4a3d-897f-6c0fc13ea033]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:47:34 compute-0 nova_compute[186662]: 2026-02-19 19:47:34.081 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:47:34 compute-0 nova_compute[186662]: 2026-02-19 19:47:34.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:47:35 compute-0 nova_compute[186662]: 2026-02-19 19:47:35.090 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:47:35 compute-0 nova_compute[186662]: 2026-02-19 19:47:35.090 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:47:35 compute-0 nova_compute[186662]: 2026-02-19 19:47:35.091 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:47:35 compute-0 nova_compute[186662]: 2026-02-19 19:47:35.091 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:47:35 compute-0 nova_compute[186662]: 2026-02-19 19:47:35.213 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:47:35 compute-0 nova_compute[186662]: 2026-02-19 19:47:35.215 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:47:35 compute-0 nova_compute[186662]: 2026-02-19 19:47:35.230 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:47:35 compute-0 nova_compute[186662]: 2026-02-19 19:47:35.230 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5819MB free_disk=72.96762466430664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:47:35 compute-0 nova_compute[186662]: 2026-02-19 19:47:35.231 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:47:35 compute-0 nova_compute[186662]: 2026-02-19 19:47:35.231 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:47:36 compute-0 nova_compute[186662]: 2026-02-19 19:47:36.194 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:36 compute-0 nova_compute[186662]: 2026-02-19 19:47:36.267 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:47:36 compute-0 nova_compute[186662]: 2026-02-19 19:47:36.267 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:47:35 up  1:18,  0 user,  load average: 0.09, 0.15, 0.25\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:47:36 compute-0 nova_compute[186662]: 2026-02-19 19:47:36.458 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:47:36 compute-0 nova_compute[186662]: 2026-02-19 19:47:36.588 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:36 compute-0 nova_compute[186662]: 2026-02-19 19:47:36.966 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:47:37 compute-0 podman[216426]: 2026-02-19 19:47:37.300574246 +0000 UTC m=+0.071463871 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, vcs-type=git)
Feb 19 19:47:37 compute-0 nova_compute[186662]: 2026-02-19 19:47:37.475 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:47:37 compute-0 nova_compute[186662]: 2026-02-19 19:47:37.475 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.244s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:47:38 compute-0 nova_compute[186662]: 2026-02-19 19:47:38.474 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:47:39 compute-0 podman[216447]: 2026-02-19 19:47:39.29949507 +0000 UTC m=+0.075929377 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 19 19:47:41 compute-0 nova_compute[186662]: 2026-02-19 19:47:41.232 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:41 compute-0 nova_compute[186662]: 2026-02-19 19:47:41.590 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:43 compute-0 podman[216474]: 2026-02-19 19:47:43.301293252 +0000 UTC m=+0.078139659 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 19 19:47:43 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:43.354 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:1d:c2 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0e146a26-0758-47c8-af70-e817f5b2de0e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e146a26-0758-47c8-af70-e817f5b2de0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '660950d25a914555816074d9de961374', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=68bd134a-4cef-45f2-9ef0-fc3a7016c2b5, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=bd903cb3-a209-443c-bdb5-907abfd8bc79) old=Port_Binding(mac=['fa:16:3e:36:1d:c2'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-0e146a26-0758-47c8-af70-e817f5b2de0e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e146a26-0758-47c8-af70-e817f5b2de0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '660950d25a914555816074d9de961374', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:47:43 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:43.356 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port bd903cb3-a209-443c-bdb5-907abfd8bc79 in datapath 0e146a26-0758-47c8-af70-e817f5b2de0e updated
Feb 19 19:47:43 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:43.357 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0e146a26-0758-47c8-af70-e817f5b2de0e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:47:43 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:47:43.358 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[fcedc6af-fee5-486a-bc35-a2829469246f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:47:46 compute-0 nova_compute[186662]: 2026-02-19 19:47:46.233 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:46 compute-0 nova_compute[186662]: 2026-02-19 19:47:46.593 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:51 compute-0 nova_compute[186662]: 2026-02-19 19:47:51.234 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:51 compute-0 nova_compute[186662]: 2026-02-19 19:47:51.595 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:55 compute-0 ovn_controller[96653]: 2026-02-19T19:47:55Z|00192|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 19 19:47:56 compute-0 nova_compute[186662]: 2026-02-19 19:47:56.239 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:56 compute-0 nova_compute[186662]: 2026-02-19 19:47:56.597 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:47:59 compute-0 podman[196025]: time="2026-02-19T19:47:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:47:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:47:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:47:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:47:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2195 "" "Go-http-client/1.1"
Feb 19 19:48:01 compute-0 nova_compute[186662]: 2026-02-19 19:48:01.240 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:01 compute-0 openstack_network_exporter[198916]: ERROR   19:48:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:48:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:48:01 compute-0 openstack_network_exporter[198916]: ERROR   19:48:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:48:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:48:01 compute-0 nova_compute[186662]: 2026-02-19 19:48:01.598 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:04 compute-0 podman[216500]: 2026-02-19 19:48:04.26847797 +0000 UTC m=+0.048875713 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Feb 19 19:48:06 compute-0 nova_compute[186662]: 2026-02-19 19:48:06.242 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:06 compute-0 nova_compute[186662]: 2026-02-19 19:48:06.633 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:08 compute-0 podman[216520]: 2026-02-19 19:48:08.264810923 +0000 UTC m=+0.044575241 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9)
Feb 19 19:48:10 compute-0 podman[216542]: 2026-02-19 19:48:10.315721423 +0000 UTC m=+0.096591017 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, 
tcib_managed=true, io.buildah.version=1.43.0)
Feb 19 19:48:11 compute-0 nova_compute[186662]: 2026-02-19 19:48:11.243 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:11 compute-0 nova_compute[186662]: 2026-02-19 19:48:11.635 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:14 compute-0 podman[216568]: 2026-02-19 19:48:14.259227541 +0000 UTC m=+0.040011064 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:48:16 compute-0 nova_compute[186662]: 2026-02-19 19:48:16.244 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:16 compute-0 nova_compute[186662]: 2026-02-19 19:48:16.637 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:19 compute-0 sshd-session[216593]: Received disconnect from 45.169.200.254 port 45870:11: Bye Bye [preauth]
Feb 19 19:48:19 compute-0 sshd-session[216593]: Disconnected from authenticating user root 45.169.200.254 port 45870 [preauth]
Feb 19 19:48:21 compute-0 nova_compute[186662]: 2026-02-19 19:48:21.245 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:21 compute-0 nova_compute[186662]: 2026-02-19 19:48:21.638 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:48:22.103 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:48:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:48:22.103 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:48:22 compute-0 nova_compute[186662]: 2026-02-19 19:48:22.104 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:24 compute-0 sshd-session[216597]: Invalid user tpaterni from 197.211.55.20 port 55580
Feb 19 19:48:25 compute-0 sshd-session[216597]: Received disconnect from 197.211.55.20 port 55580:11: Bye Bye [preauth]
Feb 19 19:48:25 compute-0 sshd-session[216597]: Disconnected from invalid user tpaterni 197.211.55.20 port 55580 [preauth]
Feb 19 19:48:25 compute-0 nova_compute[186662]: 2026-02-19 19:48:25.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:48:26 compute-0 nova_compute[186662]: 2026-02-19 19:48:26.247 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:26 compute-0 nova_compute[186662]: 2026-02-19 19:48:26.641 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:29 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:48:29.105 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:48:29 compute-0 nova_compute[186662]: 2026-02-19 19:48:29.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:48:29 compute-0 nova_compute[186662]: 2026-02-19 19:48:29.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:48:29 compute-0 podman[196025]: time="2026-02-19T19:48:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:48:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:48:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:48:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:48:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2197 "" "Go-http-client/1.1"
Feb 19 19:48:31 compute-0 nova_compute[186662]: 2026-02-19 19:48:31.248 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:31 compute-0 openstack_network_exporter[198916]: ERROR   19:48:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:48:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:48:31 compute-0 openstack_network_exporter[198916]: ERROR   19:48:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:48:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:48:31 compute-0 nova_compute[186662]: 2026-02-19 19:48:31.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:48:31 compute-0 nova_compute[186662]: 2026-02-19 19:48:31.576 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:48:31 compute-0 nova_compute[186662]: 2026-02-19 19:48:31.643 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:48:32.157 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:48:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:48:32.157 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:48:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:48:32.157 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:48:32 compute-0 nova_compute[186662]: 2026-02-19 19:48:32.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:48:34 compute-0 nova_compute[186662]: 2026-02-19 19:48:34.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:48:34 compute-0 nova_compute[186662]: 2026-02-19 19:48:34.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:48:35 compute-0 nova_compute[186662]: 2026-02-19 19:48:35.084 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:48:35 compute-0 nova_compute[186662]: 2026-02-19 19:48:35.085 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:48:35 compute-0 nova_compute[186662]: 2026-02-19 19:48:35.085 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:48:35 compute-0 nova_compute[186662]: 2026-02-19 19:48:35.085 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:48:35 compute-0 nova_compute[186662]: 2026-02-19 19:48:35.202 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:48:35 compute-0 nova_compute[186662]: 2026-02-19 19:48:35.203 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:48:35 compute-0 nova_compute[186662]: 2026-02-19 19:48:35.220 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:48:35 compute-0 nova_compute[186662]: 2026-02-19 19:48:35.221 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5818MB free_disk=72.96762466430664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:48:35 compute-0 nova_compute[186662]: 2026-02-19 19:48:35.221 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:48:35 compute-0 nova_compute[186662]: 2026-02-19 19:48:35.221 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:48:35 compute-0 podman[216600]: 2026-02-19 19:48:35.293553267 +0000 UTC m=+0.073536380 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent)
Feb 19 19:48:36 compute-0 nova_compute[186662]: 2026-02-19 19:48:36.250 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:36 compute-0 nova_compute[186662]: 2026-02-19 19:48:36.274 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:48:36 compute-0 nova_compute[186662]: 2026-02-19 19:48:36.274 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:48:35 up  1:19,  0 user,  load average: 0.07, 0.13, 0.24\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:48:36 compute-0 nova_compute[186662]: 2026-02-19 19:48:36.291 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:48:36 compute-0 nova_compute[186662]: 2026-02-19 19:48:36.645 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:36 compute-0 nova_compute[186662]: 2026-02-19 19:48:36.812 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:48:37 compute-0 nova_compute[186662]: 2026-02-19 19:48:37.323 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:48:37 compute-0 nova_compute[186662]: 2026-02-19 19:48:37.324 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.102s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:48:39 compute-0 podman[216620]: 2026-02-19 19:48:39.272453946 +0000 UTC m=+0.046793315 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 
9 Minimal, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, release=1770267347, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.7, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 19 19:48:40 compute-0 nova_compute[186662]: 2026-02-19 19:48:40.324 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:48:41 compute-0 nova_compute[186662]: 2026-02-19 19:48:41.251 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:41 compute-0 podman[216641]: 2026-02-19 19:48:41.30578796 +0000 UTC m=+0.086957190 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base 
Image, org.label-schema.schema-version=1.0)
Feb 19 19:48:41 compute-0 nova_compute[186662]: 2026-02-19 19:48:41.646 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:45 compute-0 podman[216667]: 2026-02-19 19:48:45.284531285 +0000 UTC m=+0.065700414 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 19 19:48:46 compute-0 nova_compute[186662]: 2026-02-19 19:48:46.253 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:46 compute-0 nova_compute[186662]: 2026-02-19 19:48:46.647 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:48 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 19 19:48:51 compute-0 nova_compute[186662]: 2026-02-19 19:48:51.255 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:51 compute-0 nova_compute[186662]: 2026-02-19 19:48:51.649 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:56 compute-0 nova_compute[186662]: 2026-02-19 19:48:56.529 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:56 compute-0 nova_compute[186662]: 2026-02-19 19:48:56.532 186666 DEBUG nova.virt.libvirt.driver [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Creating tmpfile /var/lib/nova/instances/tmpxw00s5wo to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Feb 19 19:48:56 compute-0 nova_compute[186662]: 2026-02-19 19:48:56.532 186666 WARNING neutronclient.v2_0.client [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:48:56 compute-0 nova_compute[186662]: 2026-02-19 19:48:56.533 186666 DEBUG nova.virt.libvirt.driver [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Creating tmpfile /var/lib/nova/instances/tmpdpiz3s_n to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Feb 19 19:48:56 compute-0 nova_compute[186662]: 2026-02-19 19:48:56.534 186666 WARNING neutronclient.v2_0.client [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:48:56 compute-0 nova_compute[186662]: 2026-02-19 19:48:56.537 186666 DEBUG nova.compute.manager [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxw00s5wo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Feb 19 19:48:56 compute-0 nova_compute[186662]: 2026-02-19 19:48:56.539 186666 DEBUG nova.compute.manager [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdpiz3s_n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Feb 19 19:48:56 compute-0 nova_compute[186662]: 2026-02-19 19:48:56.651 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:48:59 compute-0 nova_compute[186662]: 2026-02-19 19:48:59.077 186666 WARNING neutronclient.v2_0.client [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:48:59 compute-0 nova_compute[186662]: 2026-02-19 19:48:59.580 186666 WARNING neutronclient.v2_0.client [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:48:59 compute-0 podman[196025]: time="2026-02-19T19:48:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:48:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:48:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:48:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:48:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2196 "" "Go-http-client/1.1"
Feb 19 19:49:01 compute-0 nova_compute[186662]: 2026-02-19 19:49:01.258 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:01 compute-0 openstack_network_exporter[198916]: ERROR   19:49:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:49:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:49:01 compute-0 openstack_network_exporter[198916]: ERROR   19:49:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:49:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:49:01 compute-0 nova_compute[186662]: 2026-02-19 19:49:01.653 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:04 compute-0 nova_compute[186662]: 2026-02-19 19:49:04.746 186666 DEBUG nova.compute.manager [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdpiz3s_n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7238b307-58b2-4472-a465-02c1d04441b1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Feb 19 19:49:05 compute-0 nova_compute[186662]: 2026-02-19 19:49:05.765 186666 DEBUG oslo_concurrency.lockutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-7238b307-58b2-4472-a465-02c1d04441b1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:49:05 compute-0 nova_compute[186662]: 2026-02-19 19:49:05.765 186666 DEBUG oslo_concurrency.lockutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-7238b307-58b2-4472-a465-02c1d04441b1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:49:05 compute-0 nova_compute[186662]: 2026-02-19 19:49:05.765 186666 DEBUG nova.network.neutron [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:49:06 compute-0 nova_compute[186662]: 2026-02-19 19:49:06.259 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:06 compute-0 podman[216693]: 2026-02-19 19:49:06.270561444 +0000 UTC m=+0.045677917 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:49:06 compute-0 nova_compute[186662]: 2026-02-19 19:49:06.271 186666 WARNING neutronclient.v2_0.client [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:06 compute-0 nova_compute[186662]: 2026-02-19 19:49:06.654 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:07 compute-0 sshd-session[216712]: Invalid user admin from 189.165.79.177 port 60100
Feb 19 19:49:07 compute-0 sshd-session[216712]: Received disconnect from 189.165.79.177 port 60100:11: Bye Bye [preauth]
Feb 19 19:49:07 compute-0 sshd-session[216712]: Disconnected from invalid user admin 189.165.79.177 port 60100 [preauth]
Feb 19 19:49:08 compute-0 nova_compute[186662]: 2026-02-19 19:49:08.469 186666 WARNING neutronclient.v2_0.client [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:09 compute-0 nova_compute[186662]: 2026-02-19 19:49:09.375 186666 DEBUG nova.network.neutron [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Updating instance_info_cache with network_info: [{"id": "29e10756-b344-4f35-97f0-615406e5c619", "address": "fa:16:3e:a1:c3:d3", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e10756-b3", "ovs_interfaceid": "29e10756-b344-4f35-97f0-615406e5c619", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:49:09 compute-0 nova_compute[186662]: 2026-02-19 19:49:09.883 186666 DEBUG oslo_concurrency.lockutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-7238b307-58b2-4472-a465-02c1d04441b1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:49:09 compute-0 nova_compute[186662]: 2026-02-19 19:49:09.899 186666 DEBUG nova.virt.libvirt.driver [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdpiz3s_n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7238b307-58b2-4472-a465-02c1d04441b1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Feb 19 19:49:09 compute-0 nova_compute[186662]: 2026-02-19 19:49:09.900 186666 DEBUG nova.virt.libvirt.driver [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Creating instance directory: /var/lib/nova/instances/7238b307-58b2-4472-a465-02c1d04441b1 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Feb 19 19:49:09 compute-0 nova_compute[186662]: 2026-02-19 19:49:09.900 186666 DEBUG nova.virt.libvirt.driver [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Creating disk.info with the contents: {'/var/lib/nova/instances/7238b307-58b2-4472-a465-02c1d04441b1/disk': 'qcow2', '/var/lib/nova/instances/7238b307-58b2-4472-a465-02c1d04441b1/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Feb 19 19:49:09 compute-0 nova_compute[186662]: 2026-02-19 19:49:09.901 186666 DEBUG nova.virt.libvirt.driver [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Feb 19 19:49:09 compute-0 nova_compute[186662]: 2026-02-19 19:49:09.901 186666 DEBUG nova.objects.instance [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7238b307-58b2-4472-a465-02c1d04441b1 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:49:10 compute-0 podman[216714]: 2026-02-19 19:49:10.262316748 +0000 UTC m=+0.043740911 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Feb 19 19:49:10 compute-0 nova_compute[186662]: 2026-02-19 19:49:10.413 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:49:10 compute-0 nova_compute[186662]: 2026-02-19 19:49:10.416 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:49:10 compute-0 nova_compute[186662]: 2026-02-19 19:49:10.418 186666 DEBUG oslo_concurrency.processutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:49:10 compute-0 nova_compute[186662]: 2026-02-19 19:49:10.461 186666 DEBUG oslo_concurrency.processutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:49:10 compute-0 nova_compute[186662]: 2026-02-19 19:49:10.462 186666 DEBUG oslo_concurrency.lockutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:49:10 compute-0 nova_compute[186662]: 2026-02-19 19:49:10.463 186666 DEBUG oslo_concurrency.lockutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:49:10 compute-0 nova_compute[186662]: 2026-02-19 19:49:10.463 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:49:10 compute-0 nova_compute[186662]: 2026-02-19 19:49:10.466 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:49:10 compute-0 nova_compute[186662]: 2026-02-19 19:49:10.466 186666 DEBUG oslo_concurrency.processutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:49:10 compute-0 nova_compute[186662]: 2026-02-19 19:49:10.508 186666 DEBUG oslo_concurrency.processutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:49:10 compute-0 nova_compute[186662]: 2026-02-19 19:49:10.509 186666 DEBUG oslo_concurrency.processutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/7238b307-58b2-4472-a465-02c1d04441b1/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:49:10 compute-0 nova_compute[186662]: 2026-02-19 19:49:10.532 186666 DEBUG oslo_concurrency.processutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/7238b307-58b2-4472-a465-02c1d04441b1/disk 1073741824" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:49:10 compute-0 nova_compute[186662]: 2026-02-19 19:49:10.533 186666 DEBUG oslo_concurrency.lockutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.070s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:49:10 compute-0 nova_compute[186662]: 2026-02-19 19:49:10.534 186666 DEBUG oslo_concurrency.processutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:49:10 compute-0 nova_compute[186662]: 2026-02-19 19:49:10.573 186666 DEBUG oslo_concurrency.processutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:49:10 compute-0 nova_compute[186662]: 2026-02-19 19:49:10.574 186666 DEBUG nova.virt.disk.api [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Checking if we can resize image /var/lib/nova/instances/7238b307-58b2-4472-a465-02c1d04441b1/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:49:10 compute-0 nova_compute[186662]: 2026-02-19 19:49:10.574 186666 DEBUG oslo_concurrency.processutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7238b307-58b2-4472-a465-02c1d04441b1/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:49:10 compute-0 nova_compute[186662]: 2026-02-19 19:49:10.616 186666 DEBUG oslo_concurrency.processutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7238b307-58b2-4472-a465-02c1d04441b1/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:49:10 compute-0 nova_compute[186662]: 2026-02-19 19:49:10.617 186666 DEBUG nova.virt.disk.api [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Cannot resize image /var/lib/nova/instances/7238b307-58b2-4472-a465-02c1d04441b1/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:49:10 compute-0 nova_compute[186662]: 2026-02-19 19:49:10.617 186666 DEBUG nova.objects.instance [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'migration_context' on Instance uuid 7238b307-58b2-4472-a465-02c1d04441b1 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.122 186666 DEBUG nova.objects.base [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Object Instance<7238b307-58b2-4472-a465-02c1d04441b1> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.123 186666 DEBUG oslo_concurrency.processutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/7238b307-58b2-4472-a465-02c1d04441b1/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.137 186666 DEBUG oslo_concurrency.processutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/7238b307-58b2-4472-a465-02c1d04441b1/disk.config 497664" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.138 186666 DEBUG nova.virt.libvirt.driver [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.139 186666 DEBUG nova.virt.libvirt.vif [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-02-19T19:48:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1378944667',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1378944',id=27,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:48:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='660950d25a914555816074d9de961374',ramdisk_id='',reservation_id='r-1q8i74ty',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1016029365',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1016029365-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:48:31Z,user_data=None,user_id='a7713a2fa0c241d2a75d4f3d928cdc1f',uuid=7238b307-58b2-4472-a465-02c1d04441b1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "29e10756-b344-4f35-97f0-615406e5c619", "address": "fa:16:3e:a1:c3:d3", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap29e10756-b3", "ovs_interfaceid": "29e10756-b344-4f35-97f0-615406e5c619", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.139 186666 DEBUG nova.network.os_vif_util [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "29e10756-b344-4f35-97f0-615406e5c619", "address": "fa:16:3e:a1:c3:d3", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap29e10756-b3", "ovs_interfaceid": "29e10756-b344-4f35-97f0-615406e5c619", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.140 186666 DEBUG nova.network.os_vif_util [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a1:c3:d3,bridge_name='br-int',has_traffic_filtering=True,id=29e10756-b344-4f35-97f0-615406e5c619,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29e10756-b3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.141 186666 DEBUG os_vif [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:c3:d3,bridge_name='br-int',has_traffic_filtering=True,id=29e10756-b344-4f35-97f0-615406e5c619,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29e10756-b3') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.141 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.142 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.142 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.143 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.143 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '05a5ed07-e6fa-5c83-bf91-8a5ac5e3c58a', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.145 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.147 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.150 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.150 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap29e10756-b3, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.151 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap29e10756-b3, col_values=(('qos', UUID('ecabef82-91ea-445d-b55a-a6f8a416c186')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.151 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap29e10756-b3, col_values=(('external_ids', {'iface-id': '29e10756-b344-4f35-97f0-615406e5c619', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a1:c3:d3', 'vm-uuid': '7238b307-58b2-4472-a465-02c1d04441b1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.153 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:11 compute-0 NetworkManager[56519]: <info>  [1771530551.1536] manager: (tap29e10756-b3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.156 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.158 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.158 186666 INFO os_vif [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:c3:d3,bridge_name='br-int',has_traffic_filtering=True,id=29e10756-b344-4f35-97f0-615406e5c619,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29e10756-b3')
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.159 186666 DEBUG nova.virt.libvirt.driver [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.159 186666 DEBUG nova.compute.manager [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdpiz3s_n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7238b307-58b2-4472-a465-02c1d04441b1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.160 186666 WARNING neutronclient.v2_0.client [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.249 186666 WARNING neutronclient.v2_0.client [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:11 compute-0 nova_compute[186662]: 2026-02-19 19:49:11.260 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:12 compute-0 podman[216755]: 2026-02-19 19:49:12.283249916 +0000 UTC m=+0.059769273 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:49:13 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:13.273 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:49:13 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:13.274 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:49:13 compute-0 nova_compute[186662]: 2026-02-19 19:49:13.312 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:15 compute-0 nova_compute[186662]: 2026-02-19 19:49:15.448 186666 DEBUG nova.network.neutron [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Port 29e10756-b344-4f35-97f0-615406e5c619 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Feb 19 19:49:15 compute-0 nova_compute[186662]: 2026-02-19 19:49:15.464 186666 DEBUG nova.compute.manager [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdpiz3s_n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7238b307-58b2-4472-a465-02c1d04441b1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Feb 19 19:49:16 compute-0 nova_compute[186662]: 2026-02-19 19:49:16.166 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:16 compute-0 nova_compute[186662]: 2026-02-19 19:49:16.261 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:16 compute-0 podman[216782]: 2026-02-19 19:49:16.283561304 +0000 UTC m=+0.064787343 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:49:17 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 19 19:49:17 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 19 19:49:17 compute-0 kernel: tap29e10756-b3: entered promiscuous mode
Feb 19 19:49:17 compute-0 NetworkManager[56519]: <info>  [1771530557.9623] manager: (tap29e10756-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/77)
Feb 19 19:49:17 compute-0 nova_compute[186662]: 2026-02-19 19:49:17.962 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:17 compute-0 ovn_controller[96653]: 2026-02-19T19:49:17Z|00193|binding|INFO|Claiming lport 29e10756-b344-4f35-97f0-615406e5c619 for this additional chassis.
Feb 19 19:49:17 compute-0 ovn_controller[96653]: 2026-02-19T19:49:17Z|00194|binding|INFO|29e10756-b344-4f35-97f0-615406e5c619: Claiming fa:16:3e:a1:c3:d3 10.100.0.11
Feb 19 19:49:17 compute-0 nova_compute[186662]: 2026-02-19 19:49:17.964 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:17 compute-0 nova_compute[186662]: 2026-02-19 19:49:17.968 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:17.975 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a1:c3:d3 10.100.0.11'], port_security=['fa:16:3e:a1:c3:d3 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '7238b307-58b2-4472-a465-02c1d04441b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fecbde57-58e9-4df0-aab7-14888d1477cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '660950d25a914555816074d9de961374', 'neutron:revision_number': '10', 'neutron:security_group_ids': '479d32ff-921d-41a5-a067-ea1f3b43c40b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4b9fd371-626a-4670-b24e-8264e7c1e410, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=29e10756-b344-4f35-97f0-615406e5c619) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:49:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:17.976 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 29e10756-b344-4f35-97f0-615406e5c619 in datapath fecbde57-58e9-4df0-aab7-14888d1477cc unbound from our chassis
Feb 19 19:49:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:17.977 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fecbde57-58e9-4df0-aab7-14888d1477cc
Feb 19 19:49:17 compute-0 nova_compute[186662]: 2026-02-19 19:49:17.984 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:17 compute-0 ovn_controller[96653]: 2026-02-19T19:49:17Z|00195|binding|INFO|Setting lport 29e10756-b344-4f35-97f0-615406e5c619 ovn-installed in OVS
Feb 19 19:49:17 compute-0 nova_compute[186662]: 2026-02-19 19:49:17.988 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:17 compute-0 systemd-machined[156014]: New machine qemu-18-instance-0000001b.
Feb 19 19:49:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:17.988 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[c3ee9ca5-8b1b-4360-b3ee-e2a5f8e9f67d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:17.989 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfecbde57-51 in ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Feb 19 19:49:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:17.991 207540 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfecbde57-50 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Feb 19 19:49:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:17.991 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[807b157b-5ca1-4628-9700-bedb691ab400]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:17 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:17.992 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ecbdac67-e900-41aa-979b-3a94299821bd]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:17 compute-0 systemd-udevd[216843]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:49:17 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-0000001b.
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:17.999 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[aeb305b7-b7ca-4226-8cd6-05abd9a1a857]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:18 compute-0 NetworkManager[56519]: <info>  [1771530558.0044] device (tap29e10756-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:49:18 compute-0 NetworkManager[56519]: <info>  [1771530558.0048] device (tap29e10756-b3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.005 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[78f2f482-76e6-4bb7-bfc2-3c90b4c9f649]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.032 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[923d28d3-0ac6-44ae-98ac-bdf2fa80ef22]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.036 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[b6ace104-056a-48bf-adc5-81a15fd85f5a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:18 compute-0 NetworkManager[56519]: <info>  [1771530558.0375] manager: (tapfecbde57-50): new Veth device (/org/freedesktop/NetworkManager/Devices/78)
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.063 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[2467c3e1-5e8c-489f-b3b5-4a887a6a1640]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.077 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[95c4fe58-16fd-4311-bd8b-0bfa59545bab]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:18 compute-0 NetworkManager[56519]: <info>  [1771530558.0982] device (tapfecbde57-50): carrier: link connected
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.102 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[c411b4ae-2c92-431e-85fa-95ed868d4050]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.113 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[bee17f03-8c7d-4293-8866-014a3dc977eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfecbde57-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:02:e4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482766, 'reachable_time': 41879, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216875, 'error': None, 'target': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.123 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[b6b6cec7-1556-4e78-b77c-330458dc3c45]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef7:2e4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482766, 'tstamp': 482766}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216876, 'error': None, 'target': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.135 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[7f0e9780-856f-411b-9d0b-4c9ad3f30aad]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfecbde57-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:02:e4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482766, 'reachable_time': 41879, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216877, 'error': None, 'target': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.155 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[14aa8f6e-fa94-4fca-bd43-865ad9d5fe53]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.199 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[eec8f850-6de0-4f94-b8d9-d3f8ca6af097]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.201 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfecbde57-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.201 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.201 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfecbde57-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:49:18 compute-0 nova_compute[186662]: 2026-02-19 19:49:18.203 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:18 compute-0 NetworkManager[56519]: <info>  [1771530558.2042] manager: (tapfecbde57-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Feb 19 19:49:18 compute-0 kernel: tapfecbde57-50: entered promiscuous mode
Feb 19 19:49:18 compute-0 nova_compute[186662]: 2026-02-19 19:49:18.206 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.214 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfecbde57-50, col_values=(('external_ids', {'iface-id': '5190b70d-a81e-4df1-b581-b1a2cfd96252'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:49:18 compute-0 nova_compute[186662]: 2026-02-19 19:49:18.216 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:18 compute-0 nova_compute[186662]: 2026-02-19 19:49:18.217 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:18 compute-0 ovn_controller[96653]: 2026-02-19T19:49:18Z|00196|binding|INFO|Releasing lport 5190b70d-a81e-4df1-b581-b1a2cfd96252 from this chassis (sb_readonly=0)
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.224 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[705dbfe7-0cc4-415a-9a4a-78b3235cbbb3]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.225 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.225 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.225 105986 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for fecbde57-58e9-4df0-aab7-14888d1477cc disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.225 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.226 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a7fe861a-3a8b-41e8-86e2-613185ffe5f6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.226 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.226 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[cf566ae4-8f6c-4f18-b9c2-688e399fec03]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.227 105986 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: global
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     log         /dev/log local0 debug
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     log-tag     haproxy-metadata-proxy-fecbde57-58e9-4df0-aab7-14888d1477cc
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     user        root
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     group       root
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     maxconn     1024
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     pidfile     /var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     daemon
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: defaults
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     log global
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     mode http
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     option httplog
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     option dontlognull
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     option http-server-close
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     option forwardfor
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     retries                 3
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     timeout http-request    30s
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     timeout connect         30s
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     timeout client          32s
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     timeout server          32s
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     timeout http-keep-alive 30s
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: listen listener
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     bind 169.254.169.254:80
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     server metadata /var/lib/neutron/metadata_proxy
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:     http-request add-header X-OVN-Network-ID fecbde57-58e9-4df0-aab7-14888d1477cc
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Feb 19 19:49:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:18.227 105986 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'env', 'PROCESS_TAG=haproxy-fecbde57-58e9-4df0-aab7-14888d1477cc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fecbde57-58e9-4df0-aab7-14888d1477cc.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Feb 19 19:49:18 compute-0 nova_compute[186662]: 2026-02-19 19:49:18.227 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:18 compute-0 podman[216916]: 2026-02-19 19:49:18.58145553 +0000 UTC m=+0.049409106 container create b2e95e15f16774fff04db705b9254e793fcc358423b5330869c24cbde9e6c791 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc, io.buildah.version=1.43.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 19 19:49:18 compute-0 systemd[1]: Started libpod-conmon-b2e95e15f16774fff04db705b9254e793fcc358423b5330869c24cbde9e6c791.scope.
Feb 19 19:49:18 compute-0 podman[216916]: 2026-02-19 19:49:18.555973624 +0000 UTC m=+0.023927210 image pull dfc4ee2c621ea595c16a7cd7a8233293e80df6b99346b1ce1a7ed22a35aace27 38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Feb 19 19:49:18 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:49:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f60341e6c2dee28020fdf6d7a904ac060b622d11a299a7537ba4b812462ccfed/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 19 19:49:18 compute-0 podman[216916]: 2026-02-19 19:49:18.666385581 +0000 UTC m=+0.134339177 container init b2e95e15f16774fff04db705b9254e793fcc358423b5330869c24cbde9e6c791 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.43.0, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0)
Feb 19 19:49:18 compute-0 podman[216916]: 2026-02-19 19:49:18.714604087 +0000 UTC m=+0.182557653 container start b2e95e15f16774fff04db705b9254e793fcc358423b5330869c24cbde9e6c791 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS)
Feb 19 19:49:18 compute-0 neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc[216931]: [NOTICE]   (216935) : New worker (216937) forked
Feb 19 19:49:18 compute-0 neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc[216931]: [NOTICE]   (216935) : Loading success.
Feb 19 19:49:20 compute-0 ovn_controller[96653]: 2026-02-19T19:49:20Z|00197|binding|INFO|Claiming lport 29e10756-b344-4f35-97f0-615406e5c619 for this chassis.
Feb 19 19:49:20 compute-0 ovn_controller[96653]: 2026-02-19T19:49:20Z|00198|binding|INFO|29e10756-b344-4f35-97f0-615406e5c619: Claiming fa:16:3e:a1:c3:d3 10.100.0.11
Feb 19 19:49:20 compute-0 ovn_controller[96653]: 2026-02-19T19:49:20Z|00199|binding|INFO|Setting lport 29e10756-b344-4f35-97f0-615406e5c619 up in Southbound
Feb 19 19:49:21 compute-0 nova_compute[186662]: 2026-02-19 19:49:21.171 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:21 compute-0 nova_compute[186662]: 2026-02-19 19:49:21.263 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:21 compute-0 nova_compute[186662]: 2026-02-19 19:49:21.938 186666 INFO nova.compute.manager [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Post operation of migration started
Feb 19 19:49:21 compute-0 nova_compute[186662]: 2026-02-19 19:49:21.939 186666 WARNING neutronclient.v2_0.client [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:22 compute-0 nova_compute[186662]: 2026-02-19 19:49:22.230 186666 WARNING neutronclient.v2_0.client [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:22 compute-0 nova_compute[186662]: 2026-02-19 19:49:22.232 186666 WARNING neutronclient.v2_0.client [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:22 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:22.275 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:49:22 compute-0 nova_compute[186662]: 2026-02-19 19:49:22.321 186666 DEBUG oslo_concurrency.lockutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-7238b307-58b2-4472-a465-02c1d04441b1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:49:22 compute-0 nova_compute[186662]: 2026-02-19 19:49:22.322 186666 DEBUG oslo_concurrency.lockutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-7238b307-58b2-4472-a465-02c1d04441b1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:49:22 compute-0 nova_compute[186662]: 2026-02-19 19:49:22.322 186666 DEBUG nova.network.neutron [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:49:22 compute-0 nova_compute[186662]: 2026-02-19 19:49:22.913 186666 WARNING neutronclient.v2_0.client [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:23 compute-0 nova_compute[186662]: 2026-02-19 19:49:23.424 186666 WARNING neutronclient.v2_0.client [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:23 compute-0 nova_compute[186662]: 2026-02-19 19:49:23.558 186666 DEBUG nova.network.neutron [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Updating instance_info_cache with network_info: [{"id": "29e10756-b344-4f35-97f0-615406e5c619", "address": "fa:16:3e:a1:c3:d3", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e10756-b3", "ovs_interfaceid": "29e10756-b344-4f35-97f0-615406e5c619", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:49:24 compute-0 nova_compute[186662]: 2026-02-19 19:49:24.065 186666 DEBUG oslo_concurrency.lockutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-7238b307-58b2-4472-a465-02c1d04441b1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:49:24 compute-0 nova_compute[186662]: 2026-02-19 19:49:24.583 186666 DEBUG oslo_concurrency.lockutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:49:24 compute-0 nova_compute[186662]: 2026-02-19 19:49:24.584 186666 DEBUG oslo_concurrency.lockutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:49:24 compute-0 nova_compute[186662]: 2026-02-19 19:49:24.584 186666 DEBUG oslo_concurrency.lockutils [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:49:24 compute-0 nova_compute[186662]: 2026-02-19 19:49:24.596 186666 INFO nova.virt.libvirt.driver [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 19 19:49:24 compute-0 virtqemud[186157]: Domain id=18 name='instance-0000001b' uuid=7238b307-58b2-4472-a465-02c1d04441b1 is tainted: custom-monitor
Feb 19 19:49:25 compute-0 nova_compute[186662]: 2026-02-19 19:49:25.605 186666 INFO nova.virt.libvirt.driver [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 19 19:49:26 compute-0 nova_compute[186662]: 2026-02-19 19:49:26.180 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:26 compute-0 nova_compute[186662]: 2026-02-19 19:49:26.266 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:26 compute-0 nova_compute[186662]: 2026-02-19 19:49:26.610 186666 INFO nova.virt.libvirt.driver [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 19 19:49:26 compute-0 nova_compute[186662]: 2026-02-19 19:49:26.615 186666 DEBUG nova.compute.manager [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:49:27 compute-0 nova_compute[186662]: 2026-02-19 19:49:27.127 186666 DEBUG nova.objects.instance [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Feb 19 19:49:27 compute-0 nova_compute[186662]: 2026-02-19 19:49:27.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:49:28 compute-0 nova_compute[186662]: 2026-02-19 19:49:28.809 186666 WARNING neutronclient.v2_0.client [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:29 compute-0 nova_compute[186662]: 2026-02-19 19:49:29.234 186666 WARNING neutronclient.v2_0.client [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:29 compute-0 nova_compute[186662]: 2026-02-19 19:49:29.235 186666 WARNING neutronclient.v2_0.client [None req-bef9e2af-b43c-40ef-b6ee-4e69362bc89d 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:29 compute-0 nova_compute[186662]: 2026-02-19 19:49:29.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:49:29 compute-0 nova_compute[186662]: 2026-02-19 19:49:29.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:49:29 compute-0 podman[196025]: time="2026-02-19T19:49:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:49:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:49:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17221 "" "Go-http-client/1.1"
Feb 19 19:49:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:49:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2660 "" "Go-http-client/1.1"
Feb 19 19:49:31 compute-0 nova_compute[186662]: 2026-02-19 19:49:31.185 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:31 compute-0 nova_compute[186662]: 2026-02-19 19:49:31.268 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:31 compute-0 openstack_network_exporter[198916]: ERROR   19:49:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:49:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:49:31 compute-0 openstack_network_exporter[198916]: ERROR   19:49:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:49:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:49:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:32.158 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:49:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:32.158 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:49:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:32.158 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:49:32 compute-0 nova_compute[186662]: 2026-02-19 19:49:32.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:49:32 compute-0 nova_compute[186662]: 2026-02-19 19:49:32.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:49:33 compute-0 nova_compute[186662]: 2026-02-19 19:49:33.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:49:34 compute-0 nova_compute[186662]: 2026-02-19 19:49:34.569 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:49:35 compute-0 nova_compute[186662]: 2026-02-19 19:49:35.083 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:49:35 compute-0 nova_compute[186662]: 2026-02-19 19:49:35.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:49:36 compute-0 nova_compute[186662]: 2026-02-19 19:49:36.085 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:49:36 compute-0 nova_compute[186662]: 2026-02-19 19:49:36.085 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:49:36 compute-0 nova_compute[186662]: 2026-02-19 19:49:36.086 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:49:36 compute-0 nova_compute[186662]: 2026-02-19 19:49:36.086 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:49:36 compute-0 nova_compute[186662]: 2026-02-19 19:49:36.190 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:36 compute-0 nova_compute[186662]: 2026-02-19 19:49:36.269 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:37 compute-0 nova_compute[186662]: 2026-02-19 19:49:37.122 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7238b307-58b2-4472-a465-02c1d04441b1/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:49:37 compute-0 nova_compute[186662]: 2026-02-19 19:49:37.162 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7238b307-58b2-4472-a465-02c1d04441b1/disk --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:49:37 compute-0 nova_compute[186662]: 2026-02-19 19:49:37.163 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7238b307-58b2-4472-a465-02c1d04441b1/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:49:37 compute-0 nova_compute[186662]: 2026-02-19 19:49:37.202 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7238b307-58b2-4472-a465-02c1d04441b1/disk --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:49:37 compute-0 podman[216968]: 2026-02-19 19:49:37.295698946 +0000 UTC m=+0.071032747 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0)
Feb 19 19:49:37 compute-0 nova_compute[186662]: 2026-02-19 19:49:37.308 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:49:37 compute-0 nova_compute[186662]: 2026-02-19 19:49:37.309 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:49:37 compute-0 nova_compute[186662]: 2026-02-19 19:49:37.320 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.011s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:49:37 compute-0 nova_compute[186662]: 2026-02-19 19:49:37.320 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5633MB free_disk=72.93855667114258GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:49:37 compute-0 nova_compute[186662]: 2026-02-19 19:49:37.320 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:49:37 compute-0 nova_compute[186662]: 2026-02-19 19:49:37.320 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:49:37 compute-0 nova_compute[186662]: 2026-02-19 19:49:37.864 186666 DEBUG nova.compute.manager [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxw00s5wo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='444d0a0d-a760-4be8-9724-a1a6aede9547',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Feb 19 19:49:38 compute-0 nova_compute[186662]: 2026-02-19 19:49:38.846 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Migration for instance 444d0a0d-a760-4be8-9724-a1a6aede9547 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Feb 19 19:49:38 compute-0 nova_compute[186662]: 2026-02-19 19:49:38.877 186666 DEBUG oslo_concurrency.lockutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-444d0a0d-a760-4be8-9724-a1a6aede9547" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:49:38 compute-0 nova_compute[186662]: 2026-02-19 19:49:38.877 186666 DEBUG oslo_concurrency.lockutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-444d0a0d-a760-4be8-9724-a1a6aede9547" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:49:38 compute-0 nova_compute[186662]: 2026-02-19 19:49:38.878 186666 DEBUG nova.network.neutron [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:49:39 compute-0 nova_compute[186662]: 2026-02-19 19:49:39.355 186666 INFO nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Updating resource usage from migration 9dc16453-0ae7-49b4-bdb1-619d57a17c11
Feb 19 19:49:39 compute-0 nova_compute[186662]: 2026-02-19 19:49:39.356 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Starting to track incoming migration 9dc16453-0ae7-49b4-bdb1-619d57a17c11 with flavor 3881472c-99fb-4fe5-ab4d-bf6223e45537 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Feb 19 19:49:39 compute-0 nova_compute[186662]: 2026-02-19 19:49:39.385 186666 WARNING neutronclient.v2_0.client [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:39 compute-0 nova_compute[186662]: 2026-02-19 19:49:39.951 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance 7238b307-58b2-4472-a465-02c1d04441b1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:49:40 compute-0 nova_compute[186662]: 2026-02-19 19:49:40.457 186666 WARNING nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance 444d0a0d-a760-4be8-9724-a1a6aede9547 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Feb 19 19:49:40 compute-0 nova_compute[186662]: 2026-02-19 19:49:40.457 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:49:40 compute-0 nova_compute[186662]: 2026-02-19 19:49:40.458 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:49:37 up  1:20,  0 user,  load average: 0.02, 0.11, 0.22\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_660950d25a914555816074d9de961374': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:49:40 compute-0 nova_compute[186662]: 2026-02-19 19:49:40.472 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing inventories for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Feb 19 19:49:40 compute-0 nova_compute[186662]: 2026-02-19 19:49:40.486 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating ProviderTree inventory for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Feb 19 19:49:40 compute-0 nova_compute[186662]: 2026-02-19 19:49:40.486 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating inventory in ProviderTree for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Feb 19 19:49:40 compute-0 nova_compute[186662]: 2026-02-19 19:49:40.501 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing aggregate associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Feb 19 19:49:40 compute-0 nova_compute[186662]: 2026-02-19 19:49:40.519 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing trait associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_SSE2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,HW_ARCH_X86_64,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_USB _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Feb 19 19:49:40 compute-0 nova_compute[186662]: 2026-02-19 19:49:40.569 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:49:40 compute-0 nova_compute[186662]: 2026-02-19 19:49:40.676 186666 WARNING neutronclient.v2_0.client [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:40 compute-0 nova_compute[186662]: 2026-02-19 19:49:40.880 186666 DEBUG nova.network.neutron [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Updating instance_info_cache with network_info: [{"id": "a17b08ae-dad6-4df5-97d5-e9c62e086b1e", "address": "fa:16:3e:b6:a5:da", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa17b08ae-da", "ovs_interfaceid": "a17b08ae-dad6-4df5-97d5-e9c62e086b1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:49:41 compute-0 nova_compute[186662]: 2026-02-19 19:49:41.077 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:49:41 compute-0 nova_compute[186662]: 2026-02-19 19:49:41.194 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:41 compute-0 nova_compute[186662]: 2026-02-19 19:49:41.271 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:41 compute-0 podman[216989]: 2026-02-19 19:49:41.286495847 +0000 UTC m=+0.056289868 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, version=9.7, architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal)
Feb 19 19:49:41 compute-0 nova_compute[186662]: 2026-02-19 19:49:41.388 186666 DEBUG oslo_concurrency.lockutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-444d0a0d-a760-4be8-9724-a1a6aede9547" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:49:41 compute-0 nova_compute[186662]: 2026-02-19 19:49:41.400 186666 DEBUG nova.virt.libvirt.driver [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxw00s5wo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='444d0a0d-a760-4be8-9724-a1a6aede9547',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Feb 19 19:49:41 compute-0 nova_compute[186662]: 2026-02-19 19:49:41.401 186666 DEBUG nova.virt.libvirt.driver [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Creating instance directory: /var/lib/nova/instances/444d0a0d-a760-4be8-9724-a1a6aede9547 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Feb 19 19:49:41 compute-0 nova_compute[186662]: 2026-02-19 19:49:41.401 186666 DEBUG nova.virt.libvirt.driver [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Creating disk.info with the contents: {'/var/lib/nova/instances/444d0a0d-a760-4be8-9724-a1a6aede9547/disk': 'qcow2', '/var/lib/nova/instances/444d0a0d-a760-4be8-9724-a1a6aede9547/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Feb 19 19:49:41 compute-0 nova_compute[186662]: 2026-02-19 19:49:41.401 186666 DEBUG nova.virt.libvirt.driver [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Feb 19 19:49:41 compute-0 nova_compute[186662]: 2026-02-19 19:49:41.402 186666 DEBUG nova.objects.instance [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 444d0a0d-a760-4be8-9724-a1a6aede9547 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:49:41 compute-0 nova_compute[186662]: 2026-02-19 19:49:41.587 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:49:41 compute-0 nova_compute[186662]: 2026-02-19 19:49:41.587 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.267s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:49:41 compute-0 nova_compute[186662]: 2026-02-19 19:49:41.908 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:49:41 compute-0 nova_compute[186662]: 2026-02-19 19:49:41.911 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:49:41 compute-0 nova_compute[186662]: 2026-02-19 19:49:41.912 186666 DEBUG oslo_concurrency.processutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:49:41 compute-0 nova_compute[186662]: 2026-02-19 19:49:41.957 186666 DEBUG oslo_concurrency.processutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:49:41 compute-0 nova_compute[186662]: 2026-02-19 19:49:41.958 186666 DEBUG oslo_concurrency.lockutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:49:41 compute-0 nova_compute[186662]: 2026-02-19 19:49:41.959 186666 DEBUG oslo_concurrency.lockutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:49:41 compute-0 nova_compute[186662]: 2026-02-19 19:49:41.960 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:49:41 compute-0 nova_compute[186662]: 2026-02-19 19:49:41.964 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:49:41 compute-0 nova_compute[186662]: 2026-02-19 19:49:41.965 186666 DEBUG oslo_concurrency.processutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.007 186666 DEBUG oslo_concurrency.processutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.009 186666 DEBUG oslo_concurrency.processutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/444d0a0d-a760-4be8-9724-a1a6aede9547/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.037 186666 DEBUG oslo_concurrency.processutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/444d0a0d-a760-4be8-9724-a1a6aede9547/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.038 186666 DEBUG oslo_concurrency.lockutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.079s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.039 186666 DEBUG oslo_concurrency.processutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.082 186666 DEBUG oslo_concurrency.processutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.083 186666 DEBUG nova.virt.disk.api [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Checking if we can resize image /var/lib/nova/instances/444d0a0d-a760-4be8-9724-a1a6aede9547/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.083 186666 DEBUG oslo_concurrency.processutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/444d0a0d-a760-4be8-9724-a1a6aede9547/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.134 186666 DEBUG oslo_concurrency.processutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/444d0a0d-a760-4be8-9724-a1a6aede9547/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.135 186666 DEBUG nova.virt.disk.api [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Cannot resize image /var/lib/nova/instances/444d0a0d-a760-4be8-9724-a1a6aede9547/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.135 186666 DEBUG nova.objects.instance [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'migration_context' on Instance uuid 444d0a0d-a760-4be8-9724-a1a6aede9547 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.642 186666 DEBUG nova.objects.base [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Object Instance<444d0a0d-a760-4be8-9724-a1a6aede9547> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.643 186666 DEBUG oslo_concurrency.processutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/444d0a0d-a760-4be8-9724-a1a6aede9547/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.658 186666 DEBUG oslo_concurrency.processutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/444d0a0d-a760-4be8-9724-a1a6aede9547/disk.config 497664" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.658 186666 DEBUG nova.virt.libvirt.driver [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.659 186666 DEBUG nova.virt.libvirt.vif [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-02-19T19:47:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1674738929',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1674738',id=26,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:48:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='660950d25a914555816074d9de961374',ramdisk_id='',reservation_id='r-yt3g8axc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1016029365',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1016029365-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:48:10Z,user_data=None,user_id='a7713a2fa0c241d2a75d4f3d928cdc1f',uuid=444d0a0d-a760-4be8-9724-a1a6aede9547,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a17b08ae-dad6-4df5-97d5-e9c62e086b1e", "address": "fa:16:3e:b6:a5:da", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa17b08ae-da", "ovs_interfaceid": "a17b08ae-dad6-4df5-97d5-e9c62e086b1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.660 186666 DEBUG nova.network.os_vif_util [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "a17b08ae-dad6-4df5-97d5-e9c62e086b1e", "address": "fa:16:3e:b6:a5:da", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa17b08ae-da", "ovs_interfaceid": "a17b08ae-dad6-4df5-97d5-e9c62e086b1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.660 186666 DEBUG nova.network.os_vif_util [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:a5:da,bridge_name='br-int',has_traffic_filtering=True,id=a17b08ae-dad6-4df5-97d5-e9c62e086b1e,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa17b08ae-da') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.671 186666 DEBUG os_vif [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:a5:da,bridge_name='br-int',has_traffic_filtering=True,id=a17b08ae-dad6-4df5-97d5-e9c62e086b1e,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa17b08ae-da') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.672 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.673 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.673 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.674 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.674 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '66e4059f-202e-5bf9-b159-ceb378f6d081', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.675 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.676 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.678 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.678 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa17b08ae-da, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.678 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapa17b08ae-da, col_values=(('qos', UUID('d198216f-26aa-4852-b730-527197a36801')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.679 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapa17b08ae-da, col_values=(('external_ids', {'iface-id': 'a17b08ae-dad6-4df5-97d5-e9c62e086b1e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b6:a5:da', 'vm-uuid': '444d0a0d-a760-4be8-9724-a1a6aede9547'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.679 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:42 compute-0 NetworkManager[56519]: <info>  [1771530582.6809] manager: (tapa17b08ae-da): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.682 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.685 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.685 186666 INFO os_vif [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:a5:da,bridge_name='br-int',has_traffic_filtering=True,id=a17b08ae-dad6-4df5-97d5-e9c62e086b1e,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa17b08ae-da')
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.686 186666 DEBUG nova.virt.libvirt.driver [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.686 186666 DEBUG nova.compute.manager [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxw00s5wo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='444d0a0d-a760-4be8-9724-a1a6aede9547',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Feb 19 19:49:42 compute-0 nova_compute[186662]: 2026-02-19 19:49:42.687 186666 WARNING neutronclient.v2_0.client [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:43 compute-0 nova_compute[186662]: 2026-02-19 19:49:43.258 186666 WARNING neutronclient.v2_0.client [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:43 compute-0 podman[217030]: 2026-02-19 19:49:43.335157597 +0000 UTC m=+0.115791714 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Feb 19 19:49:45 compute-0 nova_compute[186662]: 2026-02-19 19:49:45.243 186666 DEBUG nova.network.neutron [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Port a17b08ae-dad6-4df5-97d5-e9c62e086b1e updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Feb 19 19:49:45 compute-0 nova_compute[186662]: 2026-02-19 19:49:45.280 186666 DEBUG nova.compute.manager [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxw00s5wo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='444d0a0d-a760-4be8-9724-a1a6aede9547',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Feb 19 19:49:45 compute-0 nova_compute[186662]: 2026-02-19 19:49:45.588 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:49:46 compute-0 nova_compute[186662]: 2026-02-19 19:49:46.273 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:47 compute-0 podman[217060]: 2026-02-19 19:49:47.280423744 +0000 UTC m=+0.055696845 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:49:47 compute-0 nova_compute[186662]: 2026-02-19 19:49:47.681 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:48 compute-0 kernel: tapa17b08ae-da: entered promiscuous mode
Feb 19 19:49:48 compute-0 NetworkManager[56519]: <info>  [1771530588.0331] manager: (tapa17b08ae-da): new Tun device (/org/freedesktop/NetworkManager/Devices/81)
Feb 19 19:49:48 compute-0 ovn_controller[96653]: 2026-02-19T19:49:48Z|00200|binding|INFO|Claiming lport a17b08ae-dad6-4df5-97d5-e9c62e086b1e for this additional chassis.
Feb 19 19:49:48 compute-0 ovn_controller[96653]: 2026-02-19T19:49:48Z|00201|binding|INFO|a17b08ae-dad6-4df5-97d5-e9c62e086b1e: Claiming fa:16:3e:b6:a5:da 10.100.0.3
Feb 19 19:49:48 compute-0 nova_compute[186662]: 2026-02-19 19:49:48.033 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:48 compute-0 ovn_controller[96653]: 2026-02-19T19:49:48Z|00202|binding|INFO|Setting lport a17b08ae-dad6-4df5-97d5-e9c62e086b1e ovn-installed in OVS
Feb 19 19:49:48 compute-0 nova_compute[186662]: 2026-02-19 19:49:48.039 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:48.041 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:a5:da 10.100.0.3'], port_security=['fa:16:3e:b6:a5:da 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '444d0a0d-a760-4be8-9724-a1a6aede9547', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fecbde57-58e9-4df0-aab7-14888d1477cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '660950d25a914555816074d9de961374', 'neutron:revision_number': '10', 'neutron:security_group_ids': '479d32ff-921d-41a5-a067-ea1f3b43c40b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4b9fd371-626a-4670-b24e-8264e7c1e410, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=a17b08ae-dad6-4df5-97d5-e9c62e086b1e) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:49:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:48.042 105986 INFO neutron.agent.ovn.metadata.agent [-] Port a17b08ae-dad6-4df5-97d5-e9c62e086b1e in datapath fecbde57-58e9-4df0-aab7-14888d1477cc unbound from our chassis
Feb 19 19:49:48 compute-0 nova_compute[186662]: 2026-02-19 19:49:48.043 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:48.043 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fecbde57-58e9-4df0-aab7-14888d1477cc
Feb 19 19:49:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:48.053 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[3f99432f-f5d0-4e03-abbc-4cd75bb684d5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:48 compute-0 systemd-machined[156014]: New machine qemu-19-instance-0000001a.
Feb 19 19:49:48 compute-0 systemd-udevd[217101]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:49:48 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-0000001a.
Feb 19 19:49:48 compute-0 NetworkManager[56519]: <info>  [1771530588.0731] device (tapa17b08ae-da): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:49:48 compute-0 NetworkManager[56519]: <info>  [1771530588.0741] device (tapa17b08ae-da): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:49:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:48.077 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[0b70437b-1864-43e4-a7ce-cbaf57a15681]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:48.080 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[1c9c1b6a-d6d8-409d-8478-ca6a1a947549]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:48.100 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[a3adb598-ccc4-4f9d-9099-df1b2eebc5c9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:48.112 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a165ccab-1706-405e-a3fe-7d3282b37fea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfecbde57-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:02:e4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 26, 'tx_packets': 6, 'rx_bytes': 1372, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 26, 'tx_packets': 6, 'rx_bytes': 1372, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482766, 'reachable_time': 41879, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217112, 'error': None, 'target': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:48.124 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[15db879d-31e7-47f6-8618-9aa4a4340bcc]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfecbde57-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482773, 'tstamp': 482773}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217113, 'error': None, 'target': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfecbde57-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482775, 'tstamp': 482775}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217113, 'error': None, 'target': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:48.126 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfecbde57-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:49:48 compute-0 nova_compute[186662]: 2026-02-19 19:49:48.129 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:48.129 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfecbde57-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:49:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:48.129 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:49:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:48.130 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfecbde57-50, col_values=(('external_ids', {'iface-id': '5190b70d-a81e-4df1-b581-b1a2cfd96252'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:49:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:48.130 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:49:48 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:48.131 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[7a2167ea-8f7c-45d3-91e8-5c0439661971]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-fecbde57-58e9-4df0-aab7-14888d1477cc\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID fecbde57-58e9-4df0-aab7-14888d1477cc\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:49:51 compute-0 nova_compute[186662]: 2026-02-19 19:49:51.276 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:51 compute-0 nova_compute[186662]: 2026-02-19 19:49:51.351 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:51 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:51.351 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:49:51 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:51.353 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:49:51 compute-0 ovn_controller[96653]: 2026-02-19T19:49:51Z|00203|binding|INFO|Claiming lport a17b08ae-dad6-4df5-97d5-e9c62e086b1e for this chassis.
Feb 19 19:49:51 compute-0 ovn_controller[96653]: 2026-02-19T19:49:51Z|00204|binding|INFO|a17b08ae-dad6-4df5-97d5-e9c62e086b1e: Claiming fa:16:3e:b6:a5:da 10.100.0.3
Feb 19 19:49:51 compute-0 ovn_controller[96653]: 2026-02-19T19:49:51Z|00205|binding|INFO|Setting lport a17b08ae-dad6-4df5-97d5-e9c62e086b1e up in Southbound
Feb 19 19:49:52 compute-0 nova_compute[186662]: 2026-02-19 19:49:52.682 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:52 compute-0 nova_compute[186662]: 2026-02-19 19:49:52.766 186666 INFO nova.compute.manager [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Post operation of migration started
Feb 19 19:49:52 compute-0 nova_compute[186662]: 2026-02-19 19:49:52.766 186666 WARNING neutronclient.v2_0.client [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:53 compute-0 nova_compute[186662]: 2026-02-19 19:49:53.274 186666 WARNING neutronclient.v2_0.client [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:53 compute-0 nova_compute[186662]: 2026-02-19 19:49:53.274 186666 WARNING neutronclient.v2_0.client [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:53 compute-0 nova_compute[186662]: 2026-02-19 19:49:53.354 186666 DEBUG oslo_concurrency.lockutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-444d0a0d-a760-4be8-9724-a1a6aede9547" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:49:53 compute-0 nova_compute[186662]: 2026-02-19 19:49:53.355 186666 DEBUG oslo_concurrency.lockutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-444d0a0d-a760-4be8-9724-a1a6aede9547" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:49:53 compute-0 nova_compute[186662]: 2026-02-19 19:49:53.355 186666 DEBUG nova.network.neutron [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:49:53 compute-0 nova_compute[186662]: 2026-02-19 19:49:53.860 186666 WARNING neutronclient.v2_0.client [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:54 compute-0 nova_compute[186662]: 2026-02-19 19:49:54.528 186666 WARNING neutronclient.v2_0.client [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:49:55 compute-0 nova_compute[186662]: 2026-02-19 19:49:55.182 186666 DEBUG nova.network.neutron [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Updating instance_info_cache with network_info: [{"id": "a17b08ae-dad6-4df5-97d5-e9c62e086b1e", "address": "fa:16:3e:b6:a5:da", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa17b08ae-da", "ovs_interfaceid": "a17b08ae-dad6-4df5-97d5-e9c62e086b1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:49:55 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:49:55.354 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:49:55 compute-0 nova_compute[186662]: 2026-02-19 19:49:55.688 186666 DEBUG oslo_concurrency.lockutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-444d0a0d-a760-4be8-9724-a1a6aede9547" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:49:56 compute-0 nova_compute[186662]: 2026-02-19 19:49:56.204 186666 DEBUG oslo_concurrency.lockutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:49:56 compute-0 nova_compute[186662]: 2026-02-19 19:49:56.204 186666 DEBUG oslo_concurrency.lockutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:49:56 compute-0 nova_compute[186662]: 2026-02-19 19:49:56.205 186666 DEBUG oslo_concurrency.lockutils [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:49:56 compute-0 nova_compute[186662]: 2026-02-19 19:49:56.473 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:56 compute-0 nova_compute[186662]: 2026-02-19 19:49:56.476 186666 INFO nova.virt.libvirt.driver [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 19 19:49:56 compute-0 virtqemud[186157]: Domain id=19 name='instance-0000001a' uuid=444d0a0d-a760-4be8-9724-a1a6aede9547 is tainted: custom-monitor
Feb 19 19:49:57 compute-0 nova_compute[186662]: 2026-02-19 19:49:57.481 186666 INFO nova.virt.libvirt.driver [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 19 19:49:57 compute-0 nova_compute[186662]: 2026-02-19 19:49:57.720 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:49:58 compute-0 nova_compute[186662]: 2026-02-19 19:49:58.487 186666 INFO nova.virt.libvirt.driver [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 19 19:49:58 compute-0 nova_compute[186662]: 2026-02-19 19:49:58.492 186666 DEBUG nova.compute.manager [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:49:59 compute-0 nova_compute[186662]: 2026-02-19 19:49:59.005 186666 DEBUG nova.objects.instance [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Feb 19 19:49:59 compute-0 podman[196025]: time="2026-02-19T19:49:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:49:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:49:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17221 "" "Go-http-client/1.1"
Feb 19 19:49:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:49:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2666 "" "Go-http-client/1.1"
Feb 19 19:50:00 compute-0 nova_compute[186662]: 2026-02-19 19:50:00.028 186666 WARNING neutronclient.v2_0.client [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:50:00 compute-0 nova_compute[186662]: 2026-02-19 19:50:00.249 186666 WARNING neutronclient.v2_0.client [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:50:00 compute-0 nova_compute[186662]: 2026-02-19 19:50:00.250 186666 WARNING neutronclient.v2_0.client [None req-f3fcc099-8ef2-4689-8ba6-96bf5d13347a 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:50:01 compute-0 openstack_network_exporter[198916]: ERROR   19:50:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:50:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:50:01 compute-0 openstack_network_exporter[198916]: ERROR   19:50:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:50:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:50:01 compute-0 nova_compute[186662]: 2026-02-19 19:50:01.475 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:02 compute-0 nova_compute[186662]: 2026-02-19 19:50:02.723 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:02 compute-0 nova_compute[186662]: 2026-02-19 19:50:02.745 186666 DEBUG oslo_concurrency.lockutils [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Acquiring lock "7238b307-58b2-4472-a465-02c1d04441b1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:50:02 compute-0 nova_compute[186662]: 2026-02-19 19:50:02.745 186666 DEBUG oslo_concurrency.lockutils [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "7238b307-58b2-4472-a465-02c1d04441b1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:50:02 compute-0 nova_compute[186662]: 2026-02-19 19:50:02.745 186666 DEBUG oslo_concurrency.lockutils [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Acquiring lock "7238b307-58b2-4472-a465-02c1d04441b1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:50:02 compute-0 nova_compute[186662]: 2026-02-19 19:50:02.746 186666 DEBUG oslo_concurrency.lockutils [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "7238b307-58b2-4472-a465-02c1d04441b1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:50:02 compute-0 nova_compute[186662]: 2026-02-19 19:50:02.746 186666 DEBUG oslo_concurrency.lockutils [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "7238b307-58b2-4472-a465-02c1d04441b1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:50:02 compute-0 nova_compute[186662]: 2026-02-19 19:50:02.757 186666 INFO nova.compute.manager [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Terminating instance
Feb 19 19:50:03 compute-0 nova_compute[186662]: 2026-02-19 19:50:03.272 186666 DEBUG nova.compute.manager [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:50:03 compute-0 kernel: tap29e10756-b3 (unregistering): left promiscuous mode
Feb 19 19:50:03 compute-0 NetworkManager[56519]: <info>  [1771530603.3048] device (tap29e10756-b3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:50:03 compute-0 nova_compute[186662]: 2026-02-19 19:50:03.317 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:03 compute-0 ovn_controller[96653]: 2026-02-19T19:50:03Z|00206|binding|INFO|Releasing lport 29e10756-b344-4f35-97f0-615406e5c619 from this chassis (sb_readonly=0)
Feb 19 19:50:03 compute-0 ovn_controller[96653]: 2026-02-19T19:50:03Z|00207|binding|INFO|Setting lport 29e10756-b344-4f35-97f0-615406e5c619 down in Southbound
Feb 19 19:50:03 compute-0 ovn_controller[96653]: 2026-02-19T19:50:03Z|00208|binding|INFO|Removing iface tap29e10756-b3 ovn-installed in OVS
Feb 19 19:50:03 compute-0 nova_compute[186662]: 2026-02-19 19:50:03.323 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:03.325 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a1:c3:d3 10.100.0.11'], port_security=['fa:16:3e:a1:c3:d3 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '7238b307-58b2-4472-a465-02c1d04441b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fecbde57-58e9-4df0-aab7-14888d1477cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '660950d25a914555816074d9de961374', 'neutron:revision_number': '14', 'neutron:security_group_ids': '479d32ff-921d-41a5-a067-ea1f3b43c40b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4b9fd371-626a-4670-b24e-8264e7c1e410, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=29e10756-b344-4f35-97f0-615406e5c619) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:50:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:03.327 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 29e10756-b344-4f35-97f0-615406e5c619 in datapath fecbde57-58e9-4df0-aab7-14888d1477cc unbound from our chassis
Feb 19 19:50:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:03.328 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fecbde57-58e9-4df0-aab7-14888d1477cc
Feb 19 19:50:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:03.348 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[819f8c34-8f31-4539-b07a-7297827b813b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:50:03 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Feb 19 19:50:03 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001b.scope: Consumed 2.892s CPU time.
Feb 19 19:50:03 compute-0 systemd-machined[156014]: Machine qemu-18-instance-0000001b terminated.
Feb 19 19:50:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:03.381 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[1ad6d7aa-7d15-42a8-a2f4-43ee0ca8b5fc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:50:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:03.385 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[19f2e713-abac-43de-b50e-7a6877424b49]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:50:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:03.414 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[b80f7a60-7ca0-447e-98fb-9e6c1dad7772]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:50:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:03.434 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ed2fd11c-5f31-470d-8cc2-902760d073bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfecbde57-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:02:e4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 46, 'tx_packets': 8, 'rx_bytes': 2212, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 46, 'tx_packets': 8, 'rx_bytes': 2212, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482766, 'reachable_time': 41879, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217144, 'error': None, 'target': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:50:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:03.449 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[3040b974-e6c3-426d-8b87-0b76454017fe]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfecbde57-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482773, 'tstamp': 482773}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217145, 'error': None, 'target': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfecbde57-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482775, 'tstamp': 482775}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217145, 'error': None, 'target': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:50:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:03.450 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfecbde57-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:50:03 compute-0 nova_compute[186662]: 2026-02-19 19:50:03.452 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:03.457 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfecbde57-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:50:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:03.457 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:50:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:03.457 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfecbde57-50, col_values=(('external_ids', {'iface-id': '5190b70d-a81e-4df1-b581-b1a2cfd96252'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:50:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:03.457 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:50:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:03.459 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[b965ae21-d7e7-4d58-ac88-0018ac070ccb]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-fecbde57-58e9-4df0-aab7-14888d1477cc\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID fecbde57-58e9-4df0-aab7-14888d1477cc\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:50:03 compute-0 nova_compute[186662]: 2026-02-19 19:50:03.460 186666 DEBUG nova.compute.manager [req-318b4136-9818-403e-9c99-c04eb680ba82 req-f03bafca-08d2-4e22-8189-5f86535d3581 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Received event network-vif-unplugged-29e10756-b344-4f35-97f0-615406e5c619 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:50:03 compute-0 nova_compute[186662]: 2026-02-19 19:50:03.461 186666 DEBUG oslo_concurrency.lockutils [req-318b4136-9818-403e-9c99-c04eb680ba82 req-f03bafca-08d2-4e22-8189-5f86535d3581 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "7238b307-58b2-4472-a465-02c1d04441b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:50:03 compute-0 nova_compute[186662]: 2026-02-19 19:50:03.461 186666 DEBUG oslo_concurrency.lockutils [req-318b4136-9818-403e-9c99-c04eb680ba82 req-f03bafca-08d2-4e22-8189-5f86535d3581 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "7238b307-58b2-4472-a465-02c1d04441b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:50:03 compute-0 nova_compute[186662]: 2026-02-19 19:50:03.461 186666 DEBUG oslo_concurrency.lockutils [req-318b4136-9818-403e-9c99-c04eb680ba82 req-f03bafca-08d2-4e22-8189-5f86535d3581 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "7238b307-58b2-4472-a465-02c1d04441b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:50:03 compute-0 nova_compute[186662]: 2026-02-19 19:50:03.461 186666 DEBUG nova.compute.manager [req-318b4136-9818-403e-9c99-c04eb680ba82 req-f03bafca-08d2-4e22-8189-5f86535d3581 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] No waiting events found dispatching network-vif-unplugged-29e10756-b344-4f35-97f0-615406e5c619 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:50:03 compute-0 nova_compute[186662]: 2026-02-19 19:50:03.462 186666 DEBUG nova.compute.manager [req-318b4136-9818-403e-9c99-c04eb680ba82 req-f03bafca-08d2-4e22-8189-5f86535d3581 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Received event network-vif-unplugged-29e10756-b344-4f35-97f0-615406e5c619 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:50:03 compute-0 nova_compute[186662]: 2026-02-19 19:50:03.462 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:03 compute-0 nova_compute[186662]: 2026-02-19 19:50:03.529 186666 INFO nova.virt.libvirt.driver [-] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Instance destroyed successfully.
Feb 19 19:50:03 compute-0 nova_compute[186662]: 2026-02-19 19:50:03.529 186666 DEBUG nova.objects.instance [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lazy-loading 'resources' on Instance uuid 7238b307-58b2-4472-a465-02c1d04441b1 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:50:04 compute-0 nova_compute[186662]: 2026-02-19 19:50:04.035 186666 DEBUG nova.virt.libvirt.vif [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-02-19T19:48:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1378944667',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1378944',id=27,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:48:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='660950d25a914555816074d9de961374',ramdisk_id='',reservation_id='r-1q8i74ty',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',clean_attempts='1',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',im
age_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1016029365',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1016029365-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:49:27Z,user_data=None,user_id='a7713a2fa0c241d2a75d4f3d928cdc1f',uuid=7238b307-58b2-4472-a465-02c1d04441b1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "29e10756-b344-4f35-97f0-615406e5c619", "address": "fa:16:3e:a1:c3:d3", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e10756-b3", "ovs_interfaceid": "29e10756-b344-4f35-97f0-615406e5c619", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:50:04 compute-0 nova_compute[186662]: 2026-02-19 19:50:04.036 186666 DEBUG nova.network.os_vif_util [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Converting VIF {"id": "29e10756-b344-4f35-97f0-615406e5c619", "address": "fa:16:3e:a1:c3:d3", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e10756-b3", "ovs_interfaceid": "29e10756-b344-4f35-97f0-615406e5c619", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:50:04 compute-0 nova_compute[186662]: 2026-02-19 19:50:04.037 186666 DEBUG nova.network.os_vif_util [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a1:c3:d3,bridge_name='br-int',has_traffic_filtering=True,id=29e10756-b344-4f35-97f0-615406e5c619,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29e10756-b3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:50:04 compute-0 nova_compute[186662]: 2026-02-19 19:50:04.037 186666 DEBUG os_vif [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a1:c3:d3,bridge_name='br-int',has_traffic_filtering=True,id=29e10756-b344-4f35-97f0-615406e5c619,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29e10756-b3') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:50:04 compute-0 nova_compute[186662]: 2026-02-19 19:50:04.038 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:04 compute-0 nova_compute[186662]: 2026-02-19 19:50:04.039 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29e10756-b3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:50:04 compute-0 nova_compute[186662]: 2026-02-19 19:50:04.040 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:04 compute-0 nova_compute[186662]: 2026-02-19 19:50:04.042 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:04 compute-0 nova_compute[186662]: 2026-02-19 19:50:04.043 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:04 compute-0 nova_compute[186662]: 2026-02-19 19:50:04.043 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=ecabef82-91ea-445d-b55a-a6f8a416c186) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:50:04 compute-0 nova_compute[186662]: 2026-02-19 19:50:04.044 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:04 compute-0 nova_compute[186662]: 2026-02-19 19:50:04.045 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:04 compute-0 nova_compute[186662]: 2026-02-19 19:50:04.048 186666 INFO os_vif [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a1:c3:d3,bridge_name='br-int',has_traffic_filtering=True,id=29e10756-b344-4f35-97f0-615406e5c619,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29e10756-b3')
Feb 19 19:50:04 compute-0 nova_compute[186662]: 2026-02-19 19:50:04.048 186666 INFO nova.virt.libvirt.driver [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Deleting instance files /var/lib/nova/instances/7238b307-58b2-4472-a465-02c1d04441b1_del
Feb 19 19:50:04 compute-0 nova_compute[186662]: 2026-02-19 19:50:04.049 186666 INFO nova.virt.libvirt.driver [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Deletion of /var/lib/nova/instances/7238b307-58b2-4472-a465-02c1d04441b1_del complete
Feb 19 19:50:04 compute-0 nova_compute[186662]: 2026-02-19 19:50:04.557 186666 INFO nova.compute.manager [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Took 1.28 seconds to destroy the instance on the hypervisor.
Feb 19 19:50:04 compute-0 nova_compute[186662]: 2026-02-19 19:50:04.558 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:50:04 compute-0 nova_compute[186662]: 2026-02-19 19:50:04.558 186666 DEBUG nova.compute.manager [-] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:50:04 compute-0 nova_compute[186662]: 2026-02-19 19:50:04.558 186666 DEBUG nova.network.neutron [-] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:50:04 compute-0 nova_compute[186662]: 2026-02-19 19:50:04.558 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:50:05 compute-0 nova_compute[186662]: 2026-02-19 19:50:05.163 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:50:05 compute-0 nova_compute[186662]: 2026-02-19 19:50:05.480 186666 DEBUG nova.compute.manager [req-758eab8a-c80c-4ade-8870-a94a9778ae1c req-44eeed60-dd6d-4b63-9d5d-8897d7ca7015 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Received event network-vif-deleted-29e10756-b344-4f35-97f0-615406e5c619 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:50:05 compute-0 nova_compute[186662]: 2026-02-19 19:50:05.480 186666 INFO nova.compute.manager [req-758eab8a-c80c-4ade-8870-a94a9778ae1c req-44eeed60-dd6d-4b63-9d5d-8897d7ca7015 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Neutron deleted interface 29e10756-b344-4f35-97f0-615406e5c619; detaching it from the instance and deleting it from the info cache
Feb 19 19:50:05 compute-0 nova_compute[186662]: 2026-02-19 19:50:05.481 186666 DEBUG nova.network.neutron [req-758eab8a-c80c-4ade-8870-a94a9778ae1c req-44eeed60-dd6d-4b63-9d5d-8897d7ca7015 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:50:05 compute-0 nova_compute[186662]: 2026-02-19 19:50:05.509 186666 DEBUG nova.compute.manager [req-1ad3816f-56db-421d-81f7-63f1d10bda3a req-7d587e3c-9137-40e2-a42e-cab30165820d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Received event network-vif-unplugged-29e10756-b344-4f35-97f0-615406e5c619 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:50:05 compute-0 nova_compute[186662]: 2026-02-19 19:50:05.509 186666 DEBUG oslo_concurrency.lockutils [req-1ad3816f-56db-421d-81f7-63f1d10bda3a req-7d587e3c-9137-40e2-a42e-cab30165820d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "7238b307-58b2-4472-a465-02c1d04441b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:50:05 compute-0 nova_compute[186662]: 2026-02-19 19:50:05.509 186666 DEBUG oslo_concurrency.lockutils [req-1ad3816f-56db-421d-81f7-63f1d10bda3a req-7d587e3c-9137-40e2-a42e-cab30165820d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "7238b307-58b2-4472-a465-02c1d04441b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:50:05 compute-0 nova_compute[186662]: 2026-02-19 19:50:05.509 186666 DEBUG oslo_concurrency.lockutils [req-1ad3816f-56db-421d-81f7-63f1d10bda3a req-7d587e3c-9137-40e2-a42e-cab30165820d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "7238b307-58b2-4472-a465-02c1d04441b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:50:05 compute-0 nova_compute[186662]: 2026-02-19 19:50:05.510 186666 DEBUG nova.compute.manager [req-1ad3816f-56db-421d-81f7-63f1d10bda3a req-7d587e3c-9137-40e2-a42e-cab30165820d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] No waiting events found dispatching network-vif-unplugged-29e10756-b344-4f35-97f0-615406e5c619 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:50:05 compute-0 nova_compute[186662]: 2026-02-19 19:50:05.510 186666 DEBUG nova.compute.manager [req-1ad3816f-56db-421d-81f7-63f1d10bda3a req-7d587e3c-9137-40e2-a42e-cab30165820d 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Received event network-vif-unplugged-29e10756-b344-4f35-97f0-615406e5c619 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:50:05 compute-0 nova_compute[186662]: 2026-02-19 19:50:05.935 186666 DEBUG nova.network.neutron [-] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:50:05 compute-0 nova_compute[186662]: 2026-02-19 19:50:05.988 186666 DEBUG nova.compute.manager [req-758eab8a-c80c-4ade-8870-a94a9778ae1c req-44eeed60-dd6d-4b63-9d5d-8897d7ca7015 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Detach interface failed, port_id=29e10756-b344-4f35-97f0-615406e5c619, reason: Instance 7238b307-58b2-4472-a465-02c1d04441b1 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Feb 19 19:50:06 compute-0 nova_compute[186662]: 2026-02-19 19:50:06.441 186666 INFO nova.compute.manager [-] [instance: 7238b307-58b2-4472-a465-02c1d04441b1] Took 1.88 seconds to deallocate network for instance.
Feb 19 19:50:06 compute-0 nova_compute[186662]: 2026-02-19 19:50:06.476 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:06 compute-0 nova_compute[186662]: 2026-02-19 19:50:06.957 186666 DEBUG oslo_concurrency.lockutils [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:50:06 compute-0 nova_compute[186662]: 2026-02-19 19:50:06.958 186666 DEBUG oslo_concurrency.lockutils [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:50:07 compute-0 nova_compute[186662]: 2026-02-19 19:50:07.016 186666 DEBUG nova.compute.provider_tree [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:50:07 compute-0 nova_compute[186662]: 2026-02-19 19:50:07.522 186666 DEBUG nova.scheduler.client.report [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:50:08 compute-0 nova_compute[186662]: 2026-02-19 19:50:08.030 186666 DEBUG oslo_concurrency.lockutils [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.072s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:50:08 compute-0 nova_compute[186662]: 2026-02-19 19:50:08.050 186666 INFO nova.scheduler.client.report [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Deleted allocations for instance 7238b307-58b2-4472-a465-02c1d04441b1
Feb 19 19:50:08 compute-0 podman[217164]: 2026-02-19 19:50:08.290810165 +0000 UTC m=+0.057078558 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 19 19:50:09 compute-0 nova_compute[186662]: 2026-02-19 19:50:09.045 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:09 compute-0 nova_compute[186662]: 2026-02-19 19:50:09.076 186666 DEBUG oslo_concurrency.lockutils [None req-2d338525-483f-41c5-b7e5-5bed01802bf0 a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "7238b307-58b2-4472-a465-02c1d04441b1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.331s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:50:09 compute-0 sshd-session[217184]: Received disconnect from 45.148.10.152 port 21264:11:  [preauth]
Feb 19 19:50:09 compute-0 sshd-session[217184]: Disconnected from authenticating user root 45.148.10.152 port 21264 [preauth]
Feb 19 19:50:10 compute-0 nova_compute[186662]: 2026-02-19 19:50:10.452 186666 DEBUG oslo_concurrency.lockutils [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Acquiring lock "444d0a0d-a760-4be8-9724-a1a6aede9547" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:50:10 compute-0 nova_compute[186662]: 2026-02-19 19:50:10.453 186666 DEBUG oslo_concurrency.lockutils [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "444d0a0d-a760-4be8-9724-a1a6aede9547" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:50:10 compute-0 nova_compute[186662]: 2026-02-19 19:50:10.453 186666 DEBUG oslo_concurrency.lockutils [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Acquiring lock "444d0a0d-a760-4be8-9724-a1a6aede9547-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:50:10 compute-0 nova_compute[186662]: 2026-02-19 19:50:10.453 186666 DEBUG oslo_concurrency.lockutils [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "444d0a0d-a760-4be8-9724-a1a6aede9547-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:50:10 compute-0 nova_compute[186662]: 2026-02-19 19:50:10.453 186666 DEBUG oslo_concurrency.lockutils [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "444d0a0d-a760-4be8-9724-a1a6aede9547-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:50:10 compute-0 nova_compute[186662]: 2026-02-19 19:50:10.464 186666 INFO nova.compute.manager [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Terminating instance
Feb 19 19:50:10 compute-0 nova_compute[186662]: 2026-02-19 19:50:10.980 186666 DEBUG nova.compute.manager [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:50:11 compute-0 kernel: tapa17b08ae-da (unregistering): left promiscuous mode
Feb 19 19:50:11 compute-0 NetworkManager[56519]: <info>  [1771530611.0164] device (tapa17b08ae-da): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.017 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.024 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:11 compute-0 ovn_controller[96653]: 2026-02-19T19:50:11Z|00209|binding|INFO|Releasing lport a17b08ae-dad6-4df5-97d5-e9c62e086b1e from this chassis (sb_readonly=0)
Feb 19 19:50:11 compute-0 ovn_controller[96653]: 2026-02-19T19:50:11Z|00210|binding|INFO|Setting lport a17b08ae-dad6-4df5-97d5-e9c62e086b1e down in Southbound
Feb 19 19:50:11 compute-0 ovn_controller[96653]: 2026-02-19T19:50:11Z|00211|binding|INFO|Removing iface tapa17b08ae-da ovn-installed in OVS
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.028 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.032 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:11.035 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:a5:da 10.100.0.3'], port_security=['fa:16:3e:b6:a5:da 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '444d0a0d-a760-4be8-9724-a1a6aede9547', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fecbde57-58e9-4df0-aab7-14888d1477cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '660950d25a914555816074d9de961374', 'neutron:revision_number': '15', 'neutron:security_group_ids': '479d32ff-921d-41a5-a067-ea1f3b43c40b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4b9fd371-626a-4670-b24e-8264e7c1e410, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=a17b08ae-dad6-4df5-97d5-e9c62e086b1e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:50:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:11.035 105986 INFO neutron.agent.ovn.metadata.agent [-] Port a17b08ae-dad6-4df5-97d5-e9c62e086b1e in datapath fecbde57-58e9-4df0-aab7-14888d1477cc unbound from our chassis
Feb 19 19:50:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:11.037 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fecbde57-58e9-4df0-aab7-14888d1477cc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:50:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:11.038 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[4f442c9b-2163-44ff-860b-4ee612e05d7b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:50:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:11.039 105986 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc namespace which is not needed anymore
Feb 19 19:50:11 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Feb 19 19:50:11 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001a.scope: Consumed 2.511s CPU time.
Feb 19 19:50:11 compute-0 systemd-machined[156014]: Machine qemu-19-instance-0000001a terminated.
Feb 19 19:50:11 compute-0 neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc[216931]: [NOTICE]   (216935) : haproxy version is 3.0.5-8e879a5
Feb 19 19:50:11 compute-0 neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc[216931]: [NOTICE]   (216935) : path to executable is /usr/sbin/haproxy
Feb 19 19:50:11 compute-0 neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc[216931]: [WARNING]  (216935) : Exiting Master process...
Feb 19 19:50:11 compute-0 podman[217209]: 2026-02-19 19:50:11.141317 +0000 UTC m=+0.024045725 container kill b2e95e15f16774fff04db705b9254e793fcc358423b5330869c24cbde9e6c791 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest)
Feb 19 19:50:11 compute-0 neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc[216931]: [ALERT]    (216935) : Current worker (216937) exited with code 143 (Terminated)
Feb 19 19:50:11 compute-0 neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc[216931]: [WARNING]  (216935) : All workers exited. Exiting... (0)
Feb 19 19:50:11 compute-0 systemd[1]: libpod-b2e95e15f16774fff04db705b9254e793fcc358423b5330869c24cbde9e6c791.scope: Deactivated successfully.
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.162 186666 DEBUG nova.compute.manager [req-b0796aff-3bf3-4bd4-8536-5656af2ee964 req-694eb1a7-eaaa-4a23-89aa-771fe3fe6aa9 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Received event network-vif-unplugged-a17b08ae-dad6-4df5-97d5-e9c62e086b1e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.163 186666 DEBUG oslo_concurrency.lockutils [req-b0796aff-3bf3-4bd4-8536-5656af2ee964 req-694eb1a7-eaaa-4a23-89aa-771fe3fe6aa9 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "444d0a0d-a760-4be8-9724-a1a6aede9547-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.163 186666 DEBUG oslo_concurrency.lockutils [req-b0796aff-3bf3-4bd4-8536-5656af2ee964 req-694eb1a7-eaaa-4a23-89aa-771fe3fe6aa9 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "444d0a0d-a760-4be8-9724-a1a6aede9547-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.164 186666 DEBUG oslo_concurrency.lockutils [req-b0796aff-3bf3-4bd4-8536-5656af2ee964 req-694eb1a7-eaaa-4a23-89aa-771fe3fe6aa9 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "444d0a0d-a760-4be8-9724-a1a6aede9547-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.164 186666 DEBUG nova.compute.manager [req-b0796aff-3bf3-4bd4-8536-5656af2ee964 req-694eb1a7-eaaa-4a23-89aa-771fe3fe6aa9 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] No waiting events found dispatching network-vif-unplugged-a17b08ae-dad6-4df5-97d5-e9c62e086b1e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.165 186666 DEBUG nova.compute.manager [req-b0796aff-3bf3-4bd4-8536-5656af2ee964 req-694eb1a7-eaaa-4a23-89aa-771fe3fe6aa9 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Received event network-vif-unplugged-a17b08ae-dad6-4df5-97d5-e9c62e086b1e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:50:11 compute-0 podman[217224]: 2026-02-19 19:50:11.184655103 +0000 UTC m=+0.030128824 container died b2e95e15f16774fff04db705b9254e793fcc358423b5330869c24cbde9e6c791 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Feb 19 19:50:11 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b2e95e15f16774fff04db705b9254e793fcc358423b5330869c24cbde9e6c791-userdata-shm.mount: Deactivated successfully.
Feb 19 19:50:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-f60341e6c2dee28020fdf6d7a904ac060b622d11a299a7537ba4b812462ccfed-merged.mount: Deactivated successfully.
Feb 19 19:50:11 compute-0 podman[217224]: 2026-02-19 19:50:11.223702042 +0000 UTC m=+0.069175753 container cleanup b2e95e15f16774fff04db705b9254e793fcc358423b5330869c24cbde9e6c791 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:50:11 compute-0 systemd[1]: libpod-conmon-b2e95e15f16774fff04db705b9254e793fcc358423b5330869c24cbde9e6c791.scope: Deactivated successfully.
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.230 186666 INFO nova.virt.libvirt.driver [-] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Instance destroyed successfully.
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.231 186666 DEBUG nova.objects.instance [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lazy-loading 'resources' on Instance uuid 444d0a0d-a760-4be8-9724-a1a6aede9547 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:50:11 compute-0 podman[217226]: 2026-02-19 19:50:11.245724656 +0000 UTC m=+0.081214894 container remove b2e95e15f16774fff04db705b9254e793fcc358423b5330869c24cbde9e6c791 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Feb 19 19:50:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:11.249 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[9e3a1eff-2dcb-4b7c-b3d0-530b90ac9c53]: (4, ("Thu Feb 19 07:50:11 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc (b2e95e15f16774fff04db705b9254e793fcc358423b5330869c24cbde9e6c791)\nb2e95e15f16774fff04db705b9254e793fcc358423b5330869c24cbde9e6c791\nThu Feb 19 07:50:11 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc (b2e95e15f16774fff04db705b9254e793fcc358423b5330869c24cbde9e6c791)\nb2e95e15f16774fff04db705b9254e793fcc358423b5330869c24cbde9e6c791\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:50:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:11.250 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[af03e672-a57a-40d6-8eb6-07f5aa360c6e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:50:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:11.250 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:50:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:11.251 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[2975f873-8156-4f83-b13c-4172d39d3f68]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:50:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:11.251 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfecbde57-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.253 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:11 compute-0 kernel: tapfecbde57-50: left promiscuous mode
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.259 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:11.262 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ea949539-ed8a-4229-a721-057a7aae4dd7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:50:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:11.275 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[d2680fd9-940f-4b99-b131-e2ebe5eb8f37]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:50:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:11.275 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[c4035ca8-30d6-429b-9bed-e16d3fd60428]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:50:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:11.285 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[b7727435-1ed3-437f-a67b-6a558e04177e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482758, 'reachable_time': 40675, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217275, 'error': None, 'target': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:50:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:11.287 106358 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Feb 19 19:50:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:11.287 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[3d9ae865-e348-4ca3-afcf-62c4dc7f347c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:50:11 compute-0 systemd[1]: run-netns-ovnmeta\x2dfecbde57\x2d58e9\x2d4df0\x2daab7\x2d14888d1477cc.mount: Deactivated successfully.
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.477 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.737 186666 DEBUG nova.virt.libvirt.vif [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-02-19T19:47:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1674738929',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1674738',id=26,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:48:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='660950d25a914555816074d9de961374',ramdisk_id='',reservation_id='r-yt3g8axc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',clean_attempts='1',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',im
age_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1016029365',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1016029365-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:49:59Z,user_data=None,user_id='a7713a2fa0c241d2a75d4f3d928cdc1f',uuid=444d0a0d-a760-4be8-9724-a1a6aede9547,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a17b08ae-dad6-4df5-97d5-e9c62e086b1e", "address": "fa:16:3e:b6:a5:da", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa17b08ae-da", "ovs_interfaceid": "a17b08ae-dad6-4df5-97d5-e9c62e086b1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.737 186666 DEBUG nova.network.os_vif_util [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Converting VIF {"id": "a17b08ae-dad6-4df5-97d5-e9c62e086b1e", "address": "fa:16:3e:b6:a5:da", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa17b08ae-da", "ovs_interfaceid": "a17b08ae-dad6-4df5-97d5-e9c62e086b1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.738 186666 DEBUG nova.network.os_vif_util [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b6:a5:da,bridge_name='br-int',has_traffic_filtering=True,id=a17b08ae-dad6-4df5-97d5-e9c62e086b1e,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa17b08ae-da') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.739 186666 DEBUG os_vif [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b6:a5:da,bridge_name='br-int',has_traffic_filtering=True,id=a17b08ae-dad6-4df5-97d5-e9c62e086b1e,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa17b08ae-da') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.741 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.742 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa17b08ae-da, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.744 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.746 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.747 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.747 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=d198216f-26aa-4852-b730-527197a36801) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.748 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.749 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.751 186666 INFO os_vif [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b6:a5:da,bridge_name='br-int',has_traffic_filtering=True,id=a17b08ae-dad6-4df5-97d5-e9c62e086b1e,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa17b08ae-da')
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.751 186666 INFO nova.virt.libvirt.driver [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Deleting instance files /var/lib/nova/instances/444d0a0d-a760-4be8-9724-a1a6aede9547_del
Feb 19 19:50:11 compute-0 nova_compute[186662]: 2026-02-19 19:50:11.752 186666 INFO nova.virt.libvirt.driver [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Deletion of /var/lib/nova/instances/444d0a0d-a760-4be8-9724-a1a6aede9547_del complete
Feb 19 19:50:12 compute-0 sshd-session[217186]: Received disconnect from 106.51.64.128 port 54641:11: Bye Bye [preauth]
Feb 19 19:50:12 compute-0 sshd-session[217186]: Disconnected from authenticating user root 106.51.64.128 port 54641 [preauth]
Feb 19 19:50:12 compute-0 nova_compute[186662]: 2026-02-19 19:50:12.262 186666 INFO nova.compute.manager [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Took 1.28 seconds to destroy the instance on the hypervisor.
Feb 19 19:50:12 compute-0 nova_compute[186662]: 2026-02-19 19:50:12.263 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:50:12 compute-0 nova_compute[186662]: 2026-02-19 19:50:12.263 186666 DEBUG nova.compute.manager [-] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:50:12 compute-0 nova_compute[186662]: 2026-02-19 19:50:12.263 186666 DEBUG nova.network.neutron [-] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:50:12 compute-0 nova_compute[186662]: 2026-02-19 19:50:12.263 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:50:12 compute-0 podman[217276]: 2026-02-19 19:50:12.288777531 +0000 UTC m=+0.053495181 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, version=9.7, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter)
Feb 19 19:50:13 compute-0 nova_compute[186662]: 2026-02-19 19:50:13.224 186666 DEBUG nova.compute.manager [req-3fd763cd-c055-4a6c-a2fc-929ae3479d68 req-6f7957c2-81f1-4fd9-bf46-a92324fe8562 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Received event network-vif-unplugged-a17b08ae-dad6-4df5-97d5-e9c62e086b1e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:50:13 compute-0 nova_compute[186662]: 2026-02-19 19:50:13.224 186666 DEBUG oslo_concurrency.lockutils [req-3fd763cd-c055-4a6c-a2fc-929ae3479d68 req-6f7957c2-81f1-4fd9-bf46-a92324fe8562 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "444d0a0d-a760-4be8-9724-a1a6aede9547-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:50:13 compute-0 nova_compute[186662]: 2026-02-19 19:50:13.225 186666 DEBUG oslo_concurrency.lockutils [req-3fd763cd-c055-4a6c-a2fc-929ae3479d68 req-6f7957c2-81f1-4fd9-bf46-a92324fe8562 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "444d0a0d-a760-4be8-9724-a1a6aede9547-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:50:13 compute-0 nova_compute[186662]: 2026-02-19 19:50:13.225 186666 DEBUG oslo_concurrency.lockutils [req-3fd763cd-c055-4a6c-a2fc-929ae3479d68 req-6f7957c2-81f1-4fd9-bf46-a92324fe8562 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "444d0a0d-a760-4be8-9724-a1a6aede9547-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:50:13 compute-0 nova_compute[186662]: 2026-02-19 19:50:13.225 186666 DEBUG nova.compute.manager [req-3fd763cd-c055-4a6c-a2fc-929ae3479d68 req-6f7957c2-81f1-4fd9-bf46-a92324fe8562 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] No waiting events found dispatching network-vif-unplugged-a17b08ae-dad6-4df5-97d5-e9c62e086b1e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:50:13 compute-0 nova_compute[186662]: 2026-02-19 19:50:13.225 186666 DEBUG nova.compute.manager [req-3fd763cd-c055-4a6c-a2fc-929ae3479d68 req-6f7957c2-81f1-4fd9-bf46-a92324fe8562 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Received event network-vif-unplugged-a17b08ae-dad6-4df5-97d5-e9c62e086b1e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:50:13 compute-0 nova_compute[186662]: 2026-02-19 19:50:13.251 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:50:13 compute-0 nova_compute[186662]: 2026-02-19 19:50:13.582 186666 DEBUG nova.compute.manager [req-c658b489-b9f0-47a6-86ac-9a65e9c1e11b req-b14daf49-a2a7-4138-89f8-a36e03b0c68f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Received event network-vif-deleted-a17b08ae-dad6-4df5-97d5-e9c62e086b1e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:50:13 compute-0 nova_compute[186662]: 2026-02-19 19:50:13.582 186666 INFO nova.compute.manager [req-c658b489-b9f0-47a6-86ac-9a65e9c1e11b req-b14daf49-a2a7-4138-89f8-a36e03b0c68f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Neutron deleted interface a17b08ae-dad6-4df5-97d5-e9c62e086b1e; detaching it from the instance and deleting it from the info cache
Feb 19 19:50:13 compute-0 nova_compute[186662]: 2026-02-19 19:50:13.583 186666 DEBUG nova.network.neutron [req-c658b489-b9f0-47a6-86ac-9a65e9c1e11b req-b14daf49-a2a7-4138-89f8-a36e03b0c68f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:50:14 compute-0 nova_compute[186662]: 2026-02-19 19:50:14.039 186666 DEBUG nova.network.neutron [-] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:50:14 compute-0 nova_compute[186662]: 2026-02-19 19:50:14.090 186666 DEBUG nova.compute.manager [req-c658b489-b9f0-47a6-86ac-9a65e9c1e11b req-b14daf49-a2a7-4138-89f8-a36e03b0c68f 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Detach interface failed, port_id=a17b08ae-dad6-4df5-97d5-e9c62e086b1e, reason: Instance 444d0a0d-a760-4be8-9724-a1a6aede9547 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Feb 19 19:50:14 compute-0 podman[217297]: 2026-02-19 19:50:14.308273493 +0000 UTC m=+0.075056055 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 19 19:50:14 compute-0 nova_compute[186662]: 2026-02-19 19:50:14.545 186666 INFO nova.compute.manager [-] [instance: 444d0a0d-a760-4be8-9724-a1a6aede9547] Took 2.28 seconds to deallocate network for instance.
Feb 19 19:50:15 compute-0 nova_compute[186662]: 2026-02-19 19:50:15.065 186666 DEBUG oslo_concurrency.lockutils [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:50:15 compute-0 nova_compute[186662]: 2026-02-19 19:50:15.066 186666 DEBUG oslo_concurrency.lockutils [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:50:15 compute-0 nova_compute[186662]: 2026-02-19 19:50:15.072 186666 DEBUG oslo_concurrency.lockutils [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:50:15 compute-0 nova_compute[186662]: 2026-02-19 19:50:15.097 186666 INFO nova.scheduler.client.report [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Deleted allocations for instance 444d0a0d-a760-4be8-9724-a1a6aede9547
Feb 19 19:50:16 compute-0 nova_compute[186662]: 2026-02-19 19:50:16.120 186666 DEBUG oslo_concurrency.lockutils [None req-c332ff8f-d5ec-4bd0-b6ee-ab878bc18cff a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "444d0a0d-a760-4be8-9724-a1a6aede9547" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.667s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:50:16 compute-0 nova_compute[186662]: 2026-02-19 19:50:16.480 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:16 compute-0 nova_compute[186662]: 2026-02-19 19:50:16.749 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:18 compute-0 podman[217323]: 2026-02-19 19:50:18.267429395 +0000 UTC m=+0.044251256 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:50:19 compute-0 sshd-session[217346]: Invalid user cctv from 96.78.175.42 port 48714
Feb 19 19:50:19 compute-0 sshd-session[217346]: Received disconnect from 96.78.175.42 port 48714:11: Bye Bye [preauth]
Feb 19 19:50:19 compute-0 sshd-session[217346]: Disconnected from invalid user cctv 96.78.175.42 port 48714 [preauth]
Feb 19 19:50:21 compute-0 nova_compute[186662]: 2026-02-19 19:50:21.482 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:21 compute-0 nova_compute[186662]: 2026-02-19 19:50:21.751 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:26 compute-0 nova_compute[186662]: 2026-02-19 19:50:26.488 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:26 compute-0 nova_compute[186662]: 2026-02-19 19:50:26.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:50:26 compute-0 nova_compute[186662]: 2026-02-19 19:50:26.753 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:29 compute-0 podman[196025]: time="2026-02-19T19:50:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:50:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:50:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:50:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:50:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2193 "" "Go-http-client/1.1"
Feb 19 19:50:30 compute-0 nova_compute[186662]: 2026-02-19 19:50:30.080 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:50:30 compute-0 nova_compute[186662]: 2026-02-19 19:50:30.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:50:30 compute-0 nova_compute[186662]: 2026-02-19 19:50:30.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:50:31 compute-0 openstack_network_exporter[198916]: ERROR   19:50:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:50:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:50:31 compute-0 openstack_network_exporter[198916]: ERROR   19:50:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:50:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:50:31 compute-0 nova_compute[186662]: 2026-02-19 19:50:31.486 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:31 compute-0 nova_compute[186662]: 2026-02-19 19:50:31.755 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:32.159 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:50:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:32.159 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:50:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:32.160 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:50:32 compute-0 nova_compute[186662]: 2026-02-19 19:50:32.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:50:32 compute-0 nova_compute[186662]: 2026-02-19 19:50:32.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:50:33 compute-0 nova_compute[186662]: 2026-02-19 19:50:33.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:50:35 compute-0 nova_compute[186662]: 2026-02-19 19:50:35.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:50:36 compute-0 nova_compute[186662]: 2026-02-19 19:50:36.488 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:36 compute-0 nova_compute[186662]: 2026-02-19 19:50:36.756 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:37 compute-0 nova_compute[186662]: 2026-02-19 19:50:37.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:50:38 compute-0 nova_compute[186662]: 2026-02-19 19:50:38.095 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:50:38 compute-0 nova_compute[186662]: 2026-02-19 19:50:38.096 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:50:38 compute-0 nova_compute[186662]: 2026-02-19 19:50:38.096 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:50:38 compute-0 nova_compute[186662]: 2026-02-19 19:50:38.096 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:50:38 compute-0 nova_compute[186662]: 2026-02-19 19:50:38.228 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:50:38 compute-0 nova_compute[186662]: 2026-02-19 19:50:38.229 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:50:38 compute-0 nova_compute[186662]: 2026-02-19 19:50:38.241 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:50:38 compute-0 nova_compute[186662]: 2026-02-19 19:50:38.242 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5838MB free_disk=72.96760940551758GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:50:38 compute-0 nova_compute[186662]: 2026-02-19 19:50:38.242 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:50:38 compute-0 nova_compute[186662]: 2026-02-19 19:50:38.243 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:50:39 compute-0 podman[217352]: 2026-02-19 19:50:39.269902123 +0000 UTC m=+0.048136961 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 19 19:50:39 compute-0 nova_compute[186662]: 2026-02-19 19:50:39.295 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:50:39 compute-0 nova_compute[186662]: 2026-02-19 19:50:39.296 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:50:38 up  1:21,  0 user,  load average: 0.12, 0.12, 0.21\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:50:39 compute-0 nova_compute[186662]: 2026-02-19 19:50:39.328 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:50:39 compute-0 nova_compute[186662]: 2026-02-19 19:50:39.836 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:50:40 compute-0 nova_compute[186662]: 2026-02-19 19:50:40.344 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:50:40 compute-0 nova_compute[186662]: 2026-02-19 19:50:40.344 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.102s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:50:41 compute-0 nova_compute[186662]: 2026-02-19 19:50:41.490 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:41 compute-0 nova_compute[186662]: 2026-02-19 19:50:41.757 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:43 compute-0 podman[217373]: 2026-02-19 19:50:43.270575215 +0000 UTC m=+0.051402801 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Feb 19 19:50:43 compute-0 nova_compute[186662]: 2026-02-19 19:50:43.345 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:50:45 compute-0 podman[217395]: 2026-02-19 19:50:45.281925169 +0000 UTC m=+0.062412868 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, 
maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:50:46 compute-0 nova_compute[186662]: 2026-02-19 19:50:46.492 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:46 compute-0 nova_compute[186662]: 2026-02-19 19:50:46.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:50:46 compute-0 nova_compute[186662]: 2026-02-19 19:50:46.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Feb 19 19:50:46 compute-0 nova_compute[186662]: 2026-02-19 19:50:46.758 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:49 compute-0 podman[217422]: 2026-02-19 19:50:49.260098864 +0000 UTC m=+0.039088891 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:50:51 compute-0 nova_compute[186662]: 2026-02-19 19:50:51.493 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:51 compute-0 nova_compute[186662]: 2026-02-19 19:50:51.760 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:52 compute-0 nova_compute[186662]: 2026-02-19 19:50:52.087 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:50:52 compute-0 nova_compute[186662]: 2026-02-19 19:50:52.088 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Feb 19 19:50:52 compute-0 nova_compute[186662]: 2026-02-19 19:50:52.595 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Feb 19 19:50:54 compute-0 sshd-session[217448]: Invalid user manager from 103.67.78.251 port 48808
Feb 19 19:50:54 compute-0 sshd-session[217448]: Received disconnect from 103.67.78.251 port 48808:11: Bye Bye [preauth]
Feb 19 19:50:54 compute-0 sshd-session[217448]: Disconnected from invalid user manager 103.67.78.251 port 48808 [preauth]
Feb 19 19:50:56 compute-0 nova_compute[186662]: 2026-02-19 19:50:56.495 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:56 compute-0 nova_compute[186662]: 2026-02-19 19:50:56.762 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:57 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:57.803 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:50:57 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:50:57.804 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:50:57 compute-0 nova_compute[186662]: 2026-02-19 19:50:57.836 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:50:59 compute-0 podman[196025]: time="2026-02-19T19:50:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:50:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:50:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:50:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:50:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2200 "" "Go-http-client/1.1"
Feb 19 19:51:01 compute-0 openstack_network_exporter[198916]: ERROR   19:51:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:51:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:51:01 compute-0 openstack_network_exporter[198916]: ERROR   19:51:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:51:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:51:01 compute-0 nova_compute[186662]: 2026-02-19 19:51:01.497 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:01 compute-0 nova_compute[186662]: 2026-02-19 19:51:01.763 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:02 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:02.805 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:51:06 compute-0 nova_compute[186662]: 2026-02-19 19:51:06.497 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:06 compute-0 nova_compute[186662]: 2026-02-19 19:51:06.765 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:10 compute-0 podman[217451]: 2026-02-19 19:51:10.283985433 +0000 UTC m=+0.060100872 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Feb 19 19:51:11 compute-0 nova_compute[186662]: 2026-02-19 19:51:11.499 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:11 compute-0 nova_compute[186662]: 2026-02-19 19:51:11.767 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:14 compute-0 podman[217470]: 2026-02-19 19:51:14.264889686 +0000 UTC m=+0.046351659 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 
9., config_id=openstack_network_exporter, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, distribution-scope=public)
Feb 19 19:51:16 compute-0 podman[217492]: 2026-02-19 19:51:16.312456679 +0000 UTC m=+0.094041476 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, 
container_name=ovn_controller, io.buildah.version=1.43.0)
Feb 19 19:51:16 compute-0 nova_compute[186662]: 2026-02-19 19:51:16.500 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:16 compute-0 nova_compute[186662]: 2026-02-19 19:51:16.769 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:20 compute-0 podman[217521]: 2026-02-19 19:51:20.262842458 +0000 UTC m=+0.041353435 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:51:21 compute-0 nova_compute[186662]: 2026-02-19 19:51:21.500 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:21 compute-0 nova_compute[186662]: 2026-02-19 19:51:21.770 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:23 compute-0 nova_compute[186662]: 2026-02-19 19:51:23.221 186666 DEBUG nova.virt.libvirt.driver [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Creating tmpfile /var/lib/nova/instances/tmpurwlmjiu to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Feb 19 19:51:23 compute-0 nova_compute[186662]: 2026-02-19 19:51:23.223 186666 WARNING neutronclient.v2_0.client [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:51:23 compute-0 nova_compute[186662]: 2026-02-19 19:51:23.228 186666 DEBUG nova.compute.manager [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpurwlmjiu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Feb 19 19:51:23 compute-0 nova_compute[186662]: 2026-02-19 19:51:23.324 186666 DEBUG nova.virt.libvirt.driver [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Creating tmpfile /var/lib/nova/instances/tmp3uju6mwy to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Feb 19 19:51:23 compute-0 nova_compute[186662]: 2026-02-19 19:51:23.325 186666 WARNING neutronclient.v2_0.client [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:51:23 compute-0 nova_compute[186662]: 2026-02-19 19:51:23.329 186666 DEBUG nova.compute.manager [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3uju6mwy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Feb 19 19:51:25 compute-0 nova_compute[186662]: 2026-02-19 19:51:25.279 186666 WARNING neutronclient.v2_0.client [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:51:25 compute-0 nova_compute[186662]: 2026-02-19 19:51:25.358 186666 WARNING neutronclient.v2_0.client [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:51:26 compute-0 ovn_controller[96653]: 2026-02-19T19:51:26Z|00212|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Feb 19 19:51:26 compute-0 nova_compute[186662]: 2026-02-19 19:51:26.503 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:26 compute-0 nova_compute[186662]: 2026-02-19 19:51:26.772 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:29 compute-0 podman[196025]: time="2026-02-19T19:51:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:51:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:51:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:51:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:51:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2198 "" "Go-http-client/1.1"
Feb 19 19:51:30 compute-0 nova_compute[186662]: 2026-02-19 19:51:30.022 186666 DEBUG nova.compute.manager [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpurwlmjiu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3e898ccb-66b6-4591-8f6f-32c7ba62d150',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Feb 19 19:51:31 compute-0 nova_compute[186662]: 2026-02-19 19:51:31.039 186666 DEBUG oslo_concurrency.lockutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-3e898ccb-66b6-4591-8f6f-32c7ba62d150" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:51:31 compute-0 nova_compute[186662]: 2026-02-19 19:51:31.039 186666 DEBUG oslo_concurrency.lockutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-3e898ccb-66b6-4591-8f6f-32c7ba62d150" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:51:31 compute-0 nova_compute[186662]: 2026-02-19 19:51:31.039 186666 DEBUG nova.network.neutron [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:51:31 compute-0 nova_compute[186662]: 2026-02-19 19:51:31.083 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:51:31 compute-0 openstack_network_exporter[198916]: ERROR   19:51:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:51:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:51:31 compute-0 openstack_network_exporter[198916]: ERROR   19:51:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:51:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:51:31 compute-0 nova_compute[186662]: 2026-02-19 19:51:31.505 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:31 compute-0 nova_compute[186662]: 2026-02-19 19:51:31.545 186666 WARNING neutronclient.v2_0.client [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:51:31 compute-0 nova_compute[186662]: 2026-02-19 19:51:31.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:51:31 compute-0 nova_compute[186662]: 2026-02-19 19:51:31.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:51:31 compute-0 nova_compute[186662]: 2026-02-19 19:51:31.773 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:31 compute-0 nova_compute[186662]: 2026-02-19 19:51:31.829 186666 WARNING neutronclient.v2_0.client [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:51:31 compute-0 nova_compute[186662]: 2026-02-19 19:51:31.958 186666 DEBUG nova.network.neutron [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Updating instance_info_cache with network_info: [{"id": "0f888555-8d11-4dff-8761-be5721afca44", "address": "fa:16:3e:ba:70:5c", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f888555-8d", "ovs_interfaceid": "0f888555-8d11-4dff-8761-be5721afca44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:51:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:32.161 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:51:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:32.161 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:51:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:32.161 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:51:32 compute-0 nova_compute[186662]: 2026-02-19 19:51:32.463 186666 DEBUG oslo_concurrency.lockutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-3e898ccb-66b6-4591-8f6f-32c7ba62d150" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:51:32 compute-0 nova_compute[186662]: 2026-02-19 19:51:32.478 186666 DEBUG nova.virt.libvirt.driver [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpurwlmjiu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3e898ccb-66b6-4591-8f6f-32c7ba62d150',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Feb 19 19:51:32 compute-0 nova_compute[186662]: 2026-02-19 19:51:32.479 186666 DEBUG nova.virt.libvirt.driver [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Creating instance directory: /var/lib/nova/instances/3e898ccb-66b6-4591-8f6f-32c7ba62d150 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Feb 19 19:51:32 compute-0 nova_compute[186662]: 2026-02-19 19:51:32.479 186666 DEBUG nova.virt.libvirt.driver [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Creating disk.info with the contents: {'/var/lib/nova/instances/3e898ccb-66b6-4591-8f6f-32c7ba62d150/disk': 'qcow2', '/var/lib/nova/instances/3e898ccb-66b6-4591-8f6f-32c7ba62d150/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Feb 19 19:51:32 compute-0 nova_compute[186662]: 2026-02-19 19:51:32.480 186666 DEBUG nova.virt.libvirt.driver [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Feb 19 19:51:32 compute-0 nova_compute[186662]: 2026-02-19 19:51:32.480 186666 DEBUG nova.objects.instance [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 3e898ccb-66b6-4591-8f6f-32c7ba62d150 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:51:32 compute-0 nova_compute[186662]: 2026-02-19 19:51:32.985 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:51:32 compute-0 nova_compute[186662]: 2026-02-19 19:51:32.988 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:51:32 compute-0 nova_compute[186662]: 2026-02-19 19:51:32.990 186666 DEBUG oslo_concurrency.processutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.030 186666 DEBUG oslo_concurrency.processutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.031 186666 DEBUG oslo_concurrency.lockutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.032 186666 DEBUG oslo_concurrency.lockutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.033 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.037 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.038 186666 DEBUG oslo_concurrency.processutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.078 186666 DEBUG oslo_concurrency.processutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.079 186666 DEBUG oslo_concurrency.processutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/3e898ccb-66b6-4591-8f6f-32c7ba62d150/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.101 186666 DEBUG oslo_concurrency.processutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/3e898ccb-66b6-4591-8f6f-32c7ba62d150/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.101 186666 DEBUG oslo_concurrency.lockutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.069s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.102 186666 DEBUG oslo_concurrency.processutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.141 186666 DEBUG oslo_concurrency.processutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.141 186666 DEBUG nova.virt.disk.api [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Checking if we can resize image /var/lib/nova/instances/3e898ccb-66b6-4591-8f6f-32c7ba62d150/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.142 186666 DEBUG oslo_concurrency.processutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e898ccb-66b6-4591-8f6f-32c7ba62d150/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.183 186666 DEBUG oslo_concurrency.processutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e898ccb-66b6-4591-8f6f-32c7ba62d150/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.183 186666 DEBUG nova.virt.disk.api [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Cannot resize image /var/lib/nova/instances/3e898ccb-66b6-4591-8f6f-32c7ba62d150/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.184 186666 DEBUG nova.objects.instance [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'migration_context' on Instance uuid 3e898ccb-66b6-4591-8f6f-32c7ba62d150 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.692 186666 DEBUG nova.objects.base [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Object Instance<3e898ccb-66b6-4591-8f6f-32c7ba62d150> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.693 186666 DEBUG oslo_concurrency.processutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/3e898ccb-66b6-4591-8f6f-32c7ba62d150/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.709 186666 DEBUG oslo_concurrency.processutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/3e898ccb-66b6-4591-8f6f-32c7ba62d150/disk.config 497664" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.710 186666 DEBUG nova.virt.libvirt.driver [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.711 186666 DEBUG nova.virt.libvirt.vif [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-02-19T19:50:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1020024042',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1020024',id=28,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:50:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='660950d25a914555816074d9de961374',ramdisk_id='',reservation_id='r-ao0zy72u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1016029365',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1016029365-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:50:39Z,user_data=None,user_id='a7713a2fa0c241d2a75d4f3d928cdc1f',uuid=3e898ccb-66b6-4591-8f6f-32c7ba62d150,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0f888555-8d11-4dff-8761-be5721afca44", "address": "fa:16:3e:ba:70:5c", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0f888555-8d", "ovs_interfaceid": "0f888555-8d11-4dff-8761-be5721afca44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.712 186666 DEBUG nova.network.os_vif_util [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "0f888555-8d11-4dff-8761-be5721afca44", "address": "fa:16:3e:ba:70:5c", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0f888555-8d", "ovs_interfaceid": "0f888555-8d11-4dff-8761-be5721afca44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.712 186666 DEBUG nova.network.os_vif_util [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:70:5c,bridge_name='br-int',has_traffic_filtering=True,id=0f888555-8d11-4dff-8761-be5721afca44,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f888555-8d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.713 186666 DEBUG os_vif [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:70:5c,bridge_name='br-int',has_traffic_filtering=True,id=0f888555-8d11-4dff-8761-be5721afca44,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f888555-8d') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.714 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.714 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.715 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.716 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.716 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'e0143174-01bd-5bc5-a445-076e1dab528d', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.717 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.719 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.721 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.722 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0f888555-8d, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.722 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap0f888555-8d, col_values=(('qos', UUID('d6f81bd2-af7b-448b-a64e-ea8ca4ebf62b')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.722 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap0f888555-8d, col_values=(('external_ids', {'iface-id': '0f888555-8d11-4dff-8761-be5721afca44', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ba:70:5c', 'vm-uuid': '3e898ccb-66b6-4591-8f6f-32c7ba62d150'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.723 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:33 compute-0 NetworkManager[56519]: <info>  [1771530693.7245] manager: (tap0f888555-8d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.725 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.727 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.728 186666 INFO os_vif [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:70:5c,bridge_name='br-int',has_traffic_filtering=True,id=0f888555-8d11-4dff-8761-be5721afca44,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f888555-8d')
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.729 186666 DEBUG nova.virt.libvirt.driver [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.729 186666 DEBUG nova.compute.manager [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpurwlmjiu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3e898ccb-66b6-4591-8f6f-32c7ba62d150',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Feb 19 19:51:33 compute-0 nova_compute[186662]: 2026-02-19 19:51:33.730 186666 WARNING neutronclient.v2_0.client [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:51:34 compute-0 nova_compute[186662]: 2026-02-19 19:51:34.295 186666 WARNING neutronclient.v2_0.client [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:51:34 compute-0 nova_compute[186662]: 2026-02-19 19:51:34.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:51:34 compute-0 nova_compute[186662]: 2026-02-19 19:51:34.576 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:51:35 compute-0 nova_compute[186662]: 2026-02-19 19:51:35.149 186666 DEBUG nova.network.neutron [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Port 0f888555-8d11-4dff-8761-be5721afca44 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Feb 19 19:51:35 compute-0 nova_compute[186662]: 2026-02-19 19:51:35.159 186666 DEBUG nova.compute.manager [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpurwlmjiu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3e898ccb-66b6-4591-8f6f-32c7ba62d150',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Feb 19 19:51:35 compute-0 nova_compute[186662]: 2026-02-19 19:51:35.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:51:36 compute-0 nova_compute[186662]: 2026-02-19 19:51:36.506 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:36 compute-0 nova_compute[186662]: 2026-02-19 19:51:36.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:51:37 compute-0 nova_compute[186662]: 2026-02-19 19:51:37.087 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:51:38 compute-0 nova_compute[186662]: 2026-02-19 19:51:38.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:51:38 compute-0 nova_compute[186662]: 2026-02-19 19:51:38.726 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:39 compute-0 nova_compute[186662]: 2026-02-19 19:51:39.089 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:51:39 compute-0 nova_compute[186662]: 2026-02-19 19:51:39.090 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:51:39 compute-0 nova_compute[186662]: 2026-02-19 19:51:39.090 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:51:39 compute-0 nova_compute[186662]: 2026-02-19 19:51:39.090 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:51:39 compute-0 nova_compute[186662]: 2026-02-19 19:51:39.218 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:51:39 compute-0 nova_compute[186662]: 2026-02-19 19:51:39.219 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:51:39 compute-0 nova_compute[186662]: 2026-02-19 19:51:39.237 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:51:39 compute-0 nova_compute[186662]: 2026-02-19 19:51:39.237 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5844MB free_disk=72.96703720092773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:51:39 compute-0 nova_compute[186662]: 2026-02-19 19:51:39.238 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:51:39 compute-0 nova_compute[186662]: 2026-02-19 19:51:39.238 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:51:40 compute-0 kernel: tap0f888555-8d: entered promiscuous mode
Feb 19 19:51:40 compute-0 NetworkManager[56519]: <info>  [1771530700.0081] manager: (tap0f888555-8d): new Tun device (/org/freedesktop/NetworkManager/Devices/83)
Feb 19 19:51:40 compute-0 ovn_controller[96653]: 2026-02-19T19:51:40Z|00213|binding|INFO|Claiming lport 0f888555-8d11-4dff-8761-be5721afca44 for this additional chassis.
Feb 19 19:51:40 compute-0 ovn_controller[96653]: 2026-02-19T19:51:40Z|00214|binding|INFO|0f888555-8d11-4dff-8761-be5721afca44: Claiming fa:16:3e:ba:70:5c 10.100.0.14
Feb 19 19:51:40 compute-0 nova_compute[186662]: 2026-02-19 19:51:40.018 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:40 compute-0 nova_compute[186662]: 2026-02-19 19:51:40.020 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:40 compute-0 ovn_controller[96653]: 2026-02-19T19:51:40Z|00215|binding|INFO|Setting lport 0f888555-8d11-4dff-8761-be5721afca44 ovn-installed in OVS
Feb 19 19:51:40 compute-0 nova_compute[186662]: 2026-02-19 19:51:40.025 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.026 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:70:5c 10.100.0.14'], port_security=['fa:16:3e:ba:70:5c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '3e898ccb-66b6-4591-8f6f-32c7ba62d150', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fecbde57-58e9-4df0-aab7-14888d1477cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '660950d25a914555816074d9de961374', 'neutron:revision_number': '10', 'neutron:security_group_ids': '479d32ff-921d-41a5-a067-ea1f3b43c40b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4b9fd371-626a-4670-b24e-8264e7c1e410, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=0f888555-8d11-4dff-8761-be5721afca44) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.027 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 0f888555-8d11-4dff-8761-be5721afca44 in datapath fecbde57-58e9-4df0-aab7-14888d1477cc unbound from our chassis
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.028 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fecbde57-58e9-4df0-aab7-14888d1477cc
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.034 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[7bcc9cb9-1b8f-40dc-ad31-280af97b24b1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.035 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfecbde57-51 in ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.037 207540 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfecbde57-50 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.037 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[81b1c1e3-1f77-4c1e-91cd-5f18584a23f0]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:51:40 compute-0 systemd-machined[156014]: New machine qemu-20-instance-0000001c.
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.038 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[0b9ef03f-d834-43b5-9142-8a072279e178]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:51:40 compute-0 systemd-udevd[217584]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.047 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[8edc21e4-84c0-4127-87ca-ef4dd410b5d2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:51:40 compute-0 NetworkManager[56519]: <info>  [1771530700.0527] device (tap0f888555-8d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.053 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[6a710abf-dc0a-452f-a6f8-b256a9402ba9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:51:40 compute-0 NetworkManager[56519]: <info>  [1771530700.0536] device (tap0f888555-8d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:51:40 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-0000001c.
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.072 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[8e62ab26-aac4-49c1-8146-33c8d5a11890]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:51:40 compute-0 systemd-udevd[217588]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:51:40 compute-0 NetworkManager[56519]: <info>  [1771530700.0779] manager: (tapfecbde57-50): new Veth device (/org/freedesktop/NetworkManager/Devices/84)
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.077 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[260aff4b-448d-4ff7-b599-fa81bdc9f8a3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.100 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[8c6327ab-8976-439a-b94a-0a973a5ea85c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.102 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[1a3813f8-2b5b-4970-8414-1fce80afe888]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:51:40 compute-0 NetworkManager[56519]: <info>  [1771530700.1174] device (tapfecbde57-50): carrier: link connected
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.121 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[8daae642-4c81-4b19-8211-10ca6adc151e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.131 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[db769987-8d1c-43b8-9ba7-6cb87d94ac52]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfecbde57-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:02:e4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496968, 'reachable_time': 34968, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217616, 'error': None, 'target': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.148 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[f1cc84b0-d664-481b-b58c-95d96b6a9e47]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef7:2e4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496968, 'tstamp': 496968}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217617, 'error': None, 'target': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.161 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[690607f4-57f3-4ab3-a8b6-128b98687d0e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfecbde57-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:02:e4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496968, 'reachable_time': 34968, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217618, 'error': None, 'target': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.191 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[def9ca52-4bc4-4a05-94fc-dab20dce7268]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.243 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[12038d4c-3c23-45c5-a280-b1a317eade33]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.244 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfecbde57-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.245 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.245 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfecbde57-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:51:40 compute-0 NetworkManager[56519]: <info>  [1771530700.2470] manager: (tapfecbde57-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Feb 19 19:51:40 compute-0 kernel: tapfecbde57-50: entered promiscuous mode
Feb 19 19:51:40 compute-0 nova_compute[186662]: 2026-02-19 19:51:40.246 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.248 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfecbde57-50, col_values=(('external_ids', {'iface-id': '5190b70d-a81e-4df1-b581-b1a2cfd96252'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:51:40 compute-0 ovn_controller[96653]: 2026-02-19T19:51:40Z|00216|binding|INFO|Releasing lport 5190b70d-a81e-4df1-b581-b1a2cfd96252 from this chassis (sb_readonly=0)
Feb 19 19:51:40 compute-0 nova_compute[186662]: 2026-02-19 19:51:40.252 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.254 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[c3aec03d-6e82-4937-9b1c-53ef281b7596]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.254 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.255 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.255 105986 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for fecbde57-58e9-4df0-aab7-14888d1477cc disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.255 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.255 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[acabe03e-e6ce-418c-b478-4f59895801d7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.255 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.256 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[462929cf-1006-4109-9372-c23214f524ea]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.256 105986 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: global
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     log         /dev/log local0 debug
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     log-tag     haproxy-metadata-proxy-fecbde57-58e9-4df0-aab7-14888d1477cc
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     user        root
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     group       root
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     maxconn     1024
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     pidfile     /var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     daemon
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: defaults
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     log global
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     mode http
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     option httplog
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     option dontlognull
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     option http-server-close
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     option forwardfor
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     retries                 3
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     timeout http-request    30s
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     timeout connect         30s
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     timeout client          32s
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     timeout server          32s
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     timeout http-keep-alive 30s
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: listen listener
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     bind 169.254.169.254:80
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     server metadata /var/lib/neutron/metadata_proxy
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:     http-request add-header X-OVN-Network-ID fecbde57-58e9-4df0-aab7-14888d1477cc
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Feb 19 19:51:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:51:40.257 105986 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'env', 'PROCESS_TAG=haproxy-fecbde57-58e9-4df0-aab7-14888d1477cc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fecbde57-58e9-4df0-aab7-14888d1477cc.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Feb 19 19:51:40 compute-0 nova_compute[186662]: 2026-02-19 19:51:40.256 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Migration for instance 3e898ccb-66b6-4591-8f6f-32c7ba62d150 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Feb 19 19:51:40 compute-0 nova_compute[186662]: 2026-02-19 19:51:40.259 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Migration for instance 898cad44-87cf-4f81-a0b5-52a780c02ca8 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Feb 19 19:51:40 compute-0 podman[217656]: 2026-02-19 19:51:40.56503094 +0000 UTC m=+0.037678327 container create bee092eb444e2bbf76950bd3a2eb75874e1be3778cd2d7ccd8d2502f886179bc (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc, io.buildah.version=1.43.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:51:40 compute-0 systemd[1]: Started libpod-conmon-bee092eb444e2bbf76950bd3a2eb75874e1be3778cd2d7ccd8d2502f886179bc.scope.
Feb 19 19:51:40 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:51:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a5365a9830605f29216d04ea7c386785277cb90da12c9d7eeeeed426d80a3e8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 19 19:51:40 compute-0 podman[217656]: 2026-02-19 19:51:40.6161143 +0000 UTC m=+0.088761697 container init bee092eb444e2bbf76950bd3a2eb75874e1be3778cd2d7ccd8d2502f886179bc (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Feb 19 19:51:40 compute-0 podman[217656]: 2026-02-19 19:51:40.621522542 +0000 UTC m=+0.094169929 container start bee092eb444e2bbf76950bd3a2eb75874e1be3778cd2d7ccd8d2502f886179bc (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest)
Feb 19 19:51:40 compute-0 podman[217656]: 2026-02-19 19:51:40.546220042 +0000 UTC m=+0.018867449 image pull dfc4ee2c621ea595c16a7cd7a8233293e80df6b99346b1ce1a7ed22a35aace27 38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Feb 19 19:51:40 compute-0 neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc[217672]: [NOTICE]   (217692) : New worker (217696) forked
Feb 19 19:51:40 compute-0 neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc[217672]: [NOTICE]   (217692) : Loading success.
Feb 19 19:51:40 compute-0 podman[217669]: 2026-02-19 19:51:40.662916297 +0000 UTC m=+0.070560705 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Feb 19 19:51:41 compute-0 nova_compute[186662]: 2026-02-19 19:51:41.290 186666 INFO nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Updating resource usage from migration 907dd3a9-6ce7-431a-ab1a-e66278f5414b
Feb 19 19:51:41 compute-0 nova_compute[186662]: 2026-02-19 19:51:41.291 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Starting to track incoming migration 907dd3a9-6ce7-431a-ab1a-e66278f5414b with flavor 3881472c-99fb-4fe5-ab4d-bf6223e45537 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Feb 19 19:51:41 compute-0 nova_compute[186662]: 2026-02-19 19:51:41.508 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:41 compute-0 nova_compute[186662]: 2026-02-19 19:51:41.803 186666 INFO nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Updating resource usage from migration 12b8b405-d017-4430-9e38-0ffbe853c13b
Feb 19 19:51:41 compute-0 nova_compute[186662]: 2026-02-19 19:51:41.803 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Starting to track incoming migration 12b8b405-d017-4430-9e38-0ffbe853c13b with flavor 3881472c-99fb-4fe5-ab4d-bf6223e45537 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Feb 19 19:51:42 compute-0 ovn_controller[96653]: 2026-02-19T19:51:42Z|00217|binding|INFO|Claiming lport 0f888555-8d11-4dff-8761-be5721afca44 for this chassis.
Feb 19 19:51:42 compute-0 ovn_controller[96653]: 2026-02-19T19:51:42Z|00218|binding|INFO|0f888555-8d11-4dff-8761-be5721afca44: Claiming fa:16:3e:ba:70:5c 10.100.0.14
Feb 19 19:51:42 compute-0 ovn_controller[96653]: 2026-02-19T19:51:42Z|00219|binding|INFO|Setting lport 0f888555-8d11-4dff-8761-be5721afca44 up in Southbound
Feb 19 19:51:42 compute-0 nova_compute[186662]: 2026-02-19 19:51:42.848 186666 WARNING nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance 3e898ccb-66b6-4591-8f6f-32c7ba62d150 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Feb 19 19:51:43 compute-0 nova_compute[186662]: 2026-02-19 19:51:43.356 186666 WARNING nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance 898cad44-87cf-4f81-a0b5-52a780c02ca8 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Feb 19 19:51:43 compute-0 nova_compute[186662]: 2026-02-19 19:51:43.356 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:51:43 compute-0 nova_compute[186662]: 2026-02-19 19:51:43.356 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:51:39 up  1:22,  0 user,  load average: 0.12, 0.11, 0.20\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:51:43 compute-0 nova_compute[186662]: 2026-02-19 19:51:43.488 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:51:43 compute-0 nova_compute[186662]: 2026-02-19 19:51:43.729 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:43 compute-0 nova_compute[186662]: 2026-02-19 19:51:43.995 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:51:44 compute-0 nova_compute[186662]: 2026-02-19 19:51:44.409 186666 INFO nova.compute.manager [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Post operation of migration started
Feb 19 19:51:44 compute-0 nova_compute[186662]: 2026-02-19 19:51:44.410 186666 WARNING neutronclient.v2_0.client [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:51:44 compute-0 nova_compute[186662]: 2026-02-19 19:51:44.504 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:51:44 compute-0 nova_compute[186662]: 2026-02-19 19:51:44.504 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 5.266s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:51:45 compute-0 nova_compute[186662]: 2026-02-19 19:51:45.151 186666 WARNING neutronclient.v2_0.client [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:51:45 compute-0 nova_compute[186662]: 2026-02-19 19:51:45.151 186666 WARNING neutronclient.v2_0.client [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:51:45 compute-0 nova_compute[186662]: 2026-02-19 19:51:45.234 186666 DEBUG oslo_concurrency.lockutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-3e898ccb-66b6-4591-8f6f-32c7ba62d150" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:51:45 compute-0 nova_compute[186662]: 2026-02-19 19:51:45.235 186666 DEBUG oslo_concurrency.lockutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-3e898ccb-66b6-4591-8f6f-32c7ba62d150" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:51:45 compute-0 nova_compute[186662]: 2026-02-19 19:51:45.235 186666 DEBUG nova.network.neutron [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:51:45 compute-0 podman[217717]: 2026-02-19 19:51:45.268742224 +0000 UTC m=+0.046884061 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7)
Feb 19 19:51:45 compute-0 nova_compute[186662]: 2026-02-19 19:51:45.741 186666 WARNING neutronclient.v2_0.client [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:51:46 compute-0 nova_compute[186662]: 2026-02-19 19:51:46.503 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:51:46 compute-0 nova_compute[186662]: 2026-02-19 19:51:46.510 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:46 compute-0 nova_compute[186662]: 2026-02-19 19:51:46.592 186666 WARNING neutronclient.v2_0.client [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:51:46 compute-0 nova_compute[186662]: 2026-02-19 19:51:46.717 186666 DEBUG nova.network.neutron [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Updating instance_info_cache with network_info: [{"id": "0f888555-8d11-4dff-8761-be5721afca44", "address": "fa:16:3e:ba:70:5c", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f888555-8d", "ovs_interfaceid": "0f888555-8d11-4dff-8761-be5721afca44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:51:47 compute-0 nova_compute[186662]: 2026-02-19 19:51:47.223 186666 DEBUG oslo_concurrency.lockutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-3e898ccb-66b6-4591-8f6f-32c7ba62d150" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:51:47 compute-0 podman[217739]: 2026-02-19 19:51:47.289738252 +0000 UTC m=+0.066412885 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, config_id=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:51:47 compute-0 nova_compute[186662]: 2026-02-19 19:51:47.743 186666 DEBUG oslo_concurrency.lockutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:51:47 compute-0 nova_compute[186662]: 2026-02-19 19:51:47.743 186666 DEBUG oslo_concurrency.lockutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:51:47 compute-0 nova_compute[186662]: 2026-02-19 19:51:47.743 186666 DEBUG oslo_concurrency.lockutils [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:51:47 compute-0 nova_compute[186662]: 2026-02-19 19:51:47.747 186666 INFO nova.virt.libvirt.driver [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 19 19:51:47 compute-0 virtqemud[186157]: Domain id=20 name='instance-0000001c' uuid=3e898ccb-66b6-4591-8f6f-32c7ba62d150 is tainted: custom-monitor
Feb 19 19:51:48 compute-0 nova_compute[186662]: 2026-02-19 19:51:48.752 186666 INFO nova.virt.libvirt.driver [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 19 19:51:48 compute-0 nova_compute[186662]: 2026-02-19 19:51:48.772 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:49 compute-0 nova_compute[186662]: 2026-02-19 19:51:49.756 186666 INFO nova.virt.libvirt.driver [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 19 19:51:49 compute-0 nova_compute[186662]: 2026-02-19 19:51:49.759 186666 DEBUG nova.compute.manager [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:51:50 compute-0 nova_compute[186662]: 2026-02-19 19:51:50.276 186666 DEBUG nova.objects.instance [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Feb 19 19:51:51 compute-0 podman[217767]: 2026-02-19 19:51:51.258219981 +0000 UTC m=+0.041149251 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 19 19:51:51 compute-0 nova_compute[186662]: 2026-02-19 19:51:51.292 186666 WARNING neutronclient.v2_0.client [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:51:51 compute-0 nova_compute[186662]: 2026-02-19 19:51:51.512 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:52 compute-0 nova_compute[186662]: 2026-02-19 19:51:52.301 186666 WARNING neutronclient.v2_0.client [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:51:52 compute-0 nova_compute[186662]: 2026-02-19 19:51:52.301 186666 WARNING neutronclient.v2_0.client [None req-aa5a16ca-71a4-4c47-b6d1-71c156dc104e 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:51:53 compute-0 nova_compute[186662]: 2026-02-19 19:51:53.774 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:56 compute-0 nova_compute[186662]: 2026-02-19 19:51:56.513 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:58 compute-0 sshd-session[217794]: Received disconnect from 45.169.200.254 port 51554:11: Bye Bye [preauth]
Feb 19 19:51:58 compute-0 sshd-session[217794]: Disconnected from authenticating user root 45.169.200.254 port 51554 [preauth]
Feb 19 19:51:58 compute-0 nova_compute[186662]: 2026-02-19 19:51:58.776 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:51:59 compute-0 sshd-session[217796]: Invalid user mohammad from 197.211.55.20 port 38746
Feb 19 19:51:59 compute-0 sshd-session[217796]: Received disconnect from 197.211.55.20 port 38746:11: Bye Bye [preauth]
Feb 19 19:51:59 compute-0 sshd-session[217796]: Disconnected from invalid user mohammad 197.211.55.20 port 38746 [preauth]
Feb 19 19:51:59 compute-0 podman[196025]: time="2026-02-19T19:51:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:51:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:51:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17221 "" "Go-http-client/1.1"
Feb 19 19:51:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:51:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2660 "" "Go-http-client/1.1"
Feb 19 19:52:01 compute-0 nova_compute[186662]: 2026-02-19 19:52:01.128 186666 DEBUG nova.compute.manager [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3uju6mwy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='898cad44-87cf-4f81-a0b5-52a780c02ca8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Feb 19 19:52:01 compute-0 openstack_network_exporter[198916]: ERROR   19:52:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:52:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:52:01 compute-0 openstack_network_exporter[198916]: ERROR   19:52:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:52:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:52:01 compute-0 nova_compute[186662]: 2026-02-19 19:52:01.514 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:02 compute-0 nova_compute[186662]: 2026-02-19 19:52:02.143 186666 DEBUG oslo_concurrency.lockutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-898cad44-87cf-4f81-a0b5-52a780c02ca8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:52:02 compute-0 nova_compute[186662]: 2026-02-19 19:52:02.143 186666 DEBUG oslo_concurrency.lockutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-898cad44-87cf-4f81-a0b5-52a780c02ca8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:52:02 compute-0 nova_compute[186662]: 2026-02-19 19:52:02.144 186666 DEBUG nova.network.neutron [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:52:02 compute-0 nova_compute[186662]: 2026-02-19 19:52:02.649 186666 WARNING neutronclient.v2_0.client [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:52:03 compute-0 nova_compute[186662]: 2026-02-19 19:52:03.581 186666 WARNING neutronclient.v2_0.client [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:52:03 compute-0 nova_compute[186662]: 2026-02-19 19:52:03.818 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:04 compute-0 nova_compute[186662]: 2026-02-19 19:52:04.340 186666 DEBUG nova.network.neutron [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Updating instance_info_cache with network_info: [{"id": "ba35ea8b-14f8-4551-bd8f-febdd0d70b3d", "address": "fa:16:3e:5a:b5:95", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapba35ea8b-14", "ovs_interfaceid": "ba35ea8b-14f8-4551-bd8f-febdd0d70b3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:52:04 compute-0 nova_compute[186662]: 2026-02-19 19:52:04.847 186666 DEBUG oslo_concurrency.lockutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-898cad44-87cf-4f81-a0b5-52a780c02ca8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:52:04 compute-0 nova_compute[186662]: 2026-02-19 19:52:04.860 186666 DEBUG nova.virt.libvirt.driver [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3uju6mwy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='898cad44-87cf-4f81-a0b5-52a780c02ca8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Feb 19 19:52:04 compute-0 nova_compute[186662]: 2026-02-19 19:52:04.860 186666 DEBUG nova.virt.libvirt.driver [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Creating instance directory: /var/lib/nova/instances/898cad44-87cf-4f81-a0b5-52a780c02ca8 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Feb 19 19:52:04 compute-0 nova_compute[186662]: 2026-02-19 19:52:04.861 186666 DEBUG nova.virt.libvirt.driver [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Creating disk.info with the contents: {'/var/lib/nova/instances/898cad44-87cf-4f81-a0b5-52a780c02ca8/disk': 'qcow2', '/var/lib/nova/instances/898cad44-87cf-4f81-a0b5-52a780c02ca8/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Feb 19 19:52:04 compute-0 nova_compute[186662]: 2026-02-19 19:52:04.861 186666 DEBUG nova.virt.libvirt.driver [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Feb 19 19:52:04 compute-0 nova_compute[186662]: 2026-02-19 19:52:04.861 186666 DEBUG nova.objects.instance [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 898cad44-87cf-4f81-a0b5-52a780c02ca8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:52:05 compute-0 nova_compute[186662]: 2026-02-19 19:52:05.367 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:52:05 compute-0 nova_compute[186662]: 2026-02-19 19:52:05.370 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:52:05 compute-0 nova_compute[186662]: 2026-02-19 19:52:05.372 186666 DEBUG oslo_concurrency.processutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:52:05 compute-0 nova_compute[186662]: 2026-02-19 19:52:05.412 186666 DEBUG oslo_concurrency.processutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:52:05 compute-0 nova_compute[186662]: 2026-02-19 19:52:05.413 186666 DEBUG oslo_concurrency.lockutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:52:05 compute-0 nova_compute[186662]: 2026-02-19 19:52:05.414 186666 DEBUG oslo_concurrency.lockutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:52:05 compute-0 nova_compute[186662]: 2026-02-19 19:52:05.415 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:52:05 compute-0 nova_compute[186662]: 2026-02-19 19:52:05.418 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:52:05 compute-0 nova_compute[186662]: 2026-02-19 19:52:05.419 186666 DEBUG oslo_concurrency.processutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:52:05 compute-0 nova_compute[186662]: 2026-02-19 19:52:05.459 186666 DEBUG oslo_concurrency.processutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:52:05 compute-0 nova_compute[186662]: 2026-02-19 19:52:05.460 186666 DEBUG oslo_concurrency.processutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/898cad44-87cf-4f81-a0b5-52a780c02ca8/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:52:05 compute-0 nova_compute[186662]: 2026-02-19 19:52:05.481 186666 DEBUG oslo_concurrency.processutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/898cad44-87cf-4f81-a0b5-52a780c02ca8/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:52:05 compute-0 nova_compute[186662]: 2026-02-19 19:52:05.482 186666 DEBUG oslo_concurrency.lockutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.069s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:52:05 compute-0 nova_compute[186662]: 2026-02-19 19:52:05.483 186666 DEBUG oslo_concurrency.processutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:52:05 compute-0 nova_compute[186662]: 2026-02-19 19:52:05.521 186666 DEBUG oslo_concurrency.processutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:52:05 compute-0 nova_compute[186662]: 2026-02-19 19:52:05.522 186666 DEBUG nova.virt.disk.api [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Checking if we can resize image /var/lib/nova/instances/898cad44-87cf-4f81-a0b5-52a780c02ca8/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:52:05 compute-0 nova_compute[186662]: 2026-02-19 19:52:05.522 186666 DEBUG oslo_concurrency.processutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/898cad44-87cf-4f81-a0b5-52a780c02ca8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:52:05 compute-0 nova_compute[186662]: 2026-02-19 19:52:05.562 186666 DEBUG oslo_concurrency.processutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/898cad44-87cf-4f81-a0b5-52a780c02ca8/disk --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:52:05 compute-0 nova_compute[186662]: 2026-02-19 19:52:05.562 186666 DEBUG nova.virt.disk.api [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Cannot resize image /var/lib/nova/instances/898cad44-87cf-4f81-a0b5-52a780c02ca8/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:52:05 compute-0 nova_compute[186662]: 2026-02-19 19:52:05.562 186666 DEBUG nova.objects.instance [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'migration_context' on Instance uuid 898cad44-87cf-4f81-a0b5-52a780c02ca8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.070 186666 DEBUG nova.objects.base [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Object Instance<898cad44-87cf-4f81-a0b5-52a780c02ca8> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.071 186666 DEBUG oslo_concurrency.processutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/898cad44-87cf-4f81-a0b5-52a780c02ca8/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.086 186666 DEBUG oslo_concurrency.processutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/898cad44-87cf-4f81-a0b5-52a780c02ca8/disk.config 497664" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.087 186666 DEBUG nova.virt.libvirt.driver [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.088 186666 DEBUG nova.virt.libvirt.vif [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-02-19T19:50:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-545163397',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-5451633',id=29,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:50:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='660950d25a914555816074d9de961374',ramdisk_id='',reservation_id='r-094cl00u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1016029365',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1016029365-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:50:58Z,user_data=None,user_id='a7713a2fa0c241d2a75d4f3d928cdc1f',uuid=898cad44-87cf-4f81-a0b5-52a780c02ca8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ba35ea8b-14f8-4551-bd8f-febdd0d70b3d", "address": "fa:16:3e:5a:b5:95", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapba35ea8b-14", "ovs_interfaceid": "ba35ea8b-14f8-4551-bd8f-febdd0d70b3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.088 186666 DEBUG nova.network.os_vif_util [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "ba35ea8b-14f8-4551-bd8f-febdd0d70b3d", "address": "fa:16:3e:5a:b5:95", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapba35ea8b-14", "ovs_interfaceid": "ba35ea8b-14f8-4551-bd8f-febdd0d70b3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.089 186666 DEBUG nova.network.os_vif_util [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5a:b5:95,bridge_name='br-int',has_traffic_filtering=True,id=ba35ea8b-14f8-4551-bd8f-febdd0d70b3d,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapba35ea8b-14') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.089 186666 DEBUG os_vif [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5a:b5:95,bridge_name='br-int',has_traffic_filtering=True,id=ba35ea8b-14f8-4551-bd8f-febdd0d70b3d,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapba35ea8b-14') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.090 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.090 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.091 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.091 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.092 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '26f6dc16-e226-54ab-8304-85dc17b24d76', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.093 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.094 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.096 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.096 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapba35ea8b-14, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.097 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapba35ea8b-14, col_values=(('qos', UUID('44a0ef84-1093-4f21-ab32-60ae0cc7c093')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.097 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapba35ea8b-14, col_values=(('external_ids', {'iface-id': 'ba35ea8b-14f8-4551-bd8f-febdd0d70b3d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5a:b5:95', 'vm-uuid': '898cad44-87cf-4f81-a0b5-52a780c02ca8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.098 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:06 compute-0 NetworkManager[56519]: <info>  [1771530726.0988] manager: (tapba35ea8b-14): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.100 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.105 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.105 186666 INFO os_vif [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5a:b5:95,bridge_name='br-int',has_traffic_filtering=True,id=ba35ea8b-14f8-4551-bd8f-febdd0d70b3d,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapba35ea8b-14')
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.105 186666 DEBUG nova.virt.libvirt.driver [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.106 186666 DEBUG nova.compute.manager [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3uju6mwy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='898cad44-87cf-4f81-a0b5-52a780c02ca8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.106 186666 WARNING neutronclient.v2_0.client [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.306 186666 WARNING neutronclient.v2_0.client [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.516 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.894 186666 DEBUG nova.network.neutron [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Port ba35ea8b-14f8-4551-bd8f-febdd0d70b3d updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Feb 19 19:52:06 compute-0 nova_compute[186662]: 2026-02-19 19:52:06.905 186666 DEBUG nova.compute.manager [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3uju6mwy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='898cad44-87cf-4f81-a0b5-52a780c02ca8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Feb 19 19:52:10 compute-0 NetworkManager[56519]: <info>  [1771530730.3344] manager: (tapba35ea8b-14): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Feb 19 19:52:10 compute-0 kernel: tapba35ea8b-14: entered promiscuous mode
Feb 19 19:52:10 compute-0 ovn_controller[96653]: 2026-02-19T19:52:10Z|00220|binding|INFO|Claiming lport ba35ea8b-14f8-4551-bd8f-febdd0d70b3d for this additional chassis.
Feb 19 19:52:10 compute-0 ovn_controller[96653]: 2026-02-19T19:52:10Z|00221|binding|INFO|ba35ea8b-14f8-4551-bd8f-febdd0d70b3d: Claiming fa:16:3e:5a:b5:95 10.100.0.9
Feb 19 19:52:10 compute-0 nova_compute[186662]: 2026-02-19 19:52:10.337 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:10.342 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:b5:95 10.100.0.9'], port_security=['fa:16:3e:5a:b5:95 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '898cad44-87cf-4f81-a0b5-52a780c02ca8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fecbde57-58e9-4df0-aab7-14888d1477cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '660950d25a914555816074d9de961374', 'neutron:revision_number': '10', 'neutron:security_group_ids': '479d32ff-921d-41a5-a067-ea1f3b43c40b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4b9fd371-626a-4670-b24e-8264e7c1e410, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=ba35ea8b-14f8-4551-bd8f-febdd0d70b3d) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:52:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:10.344 105986 INFO neutron.agent.ovn.metadata.agent [-] Port ba35ea8b-14f8-4551-bd8f-febdd0d70b3d in datapath fecbde57-58e9-4df0-aab7-14888d1477cc unbound from our chassis
Feb 19 19:52:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:10.345 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fecbde57-58e9-4df0-aab7-14888d1477cc
Feb 19 19:52:10 compute-0 ovn_controller[96653]: 2026-02-19T19:52:10Z|00222|binding|INFO|Setting lport ba35ea8b-14f8-4551-bd8f-febdd0d70b3d ovn-installed in OVS
Feb 19 19:52:10 compute-0 nova_compute[186662]: 2026-02-19 19:52:10.344 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:10 compute-0 nova_compute[186662]: 2026-02-19 19:52:10.345 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:10 compute-0 nova_compute[186662]: 2026-02-19 19:52:10.346 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:10.356 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[f51016ab-c870-4ef7-b7bf-b4979d7f2780]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:10 compute-0 systemd-machined[156014]: New machine qemu-21-instance-0000001d.
Feb 19 19:52:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:10.379 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[98849b90-cedf-4fc3-b121-aac34aaae6ca]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:10 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-0000001d.
Feb 19 19:52:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:10.382 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[ba3acaad-77d5-4aff-8a2b-967833052fef]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:10 compute-0 systemd-udevd[217838]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:52:10 compute-0 NetworkManager[56519]: <info>  [1771530730.3995] device (tapba35ea8b-14): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:52:10 compute-0 NetworkManager[56519]: <info>  [1771530730.4001] device (tapba35ea8b-14): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:52:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:10.403 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[adbe0c06-33e7-408d-bc00-eed2d9db583c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:10.415 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[d81efec7-e86f-49c8-899d-cc27e3abeae8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfecbde57-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:02:e4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 26, 'tx_packets': 5, 'rx_bytes': 1372, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 26, 'tx_packets': 5, 'rx_bytes': 1372, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496968, 'reachable_time': 34968, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217847, 'error': None, 'target': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:10.426 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[1f266dba-27f5-4f6a-b095-cbb1fc84abfc]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfecbde57-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496977, 'tstamp': 496977}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217848, 'error': None, 'target': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfecbde57-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496980, 'tstamp': 496980}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217848, 'error': None, 'target': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:10.427 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfecbde57-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:52:10 compute-0 nova_compute[186662]: 2026-02-19 19:52:10.429 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:10 compute-0 nova_compute[186662]: 2026-02-19 19:52:10.430 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:10.430 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfecbde57-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:52:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:10.431 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:52:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:10.431 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfecbde57-50, col_values=(('external_ids', {'iface-id': '5190b70d-a81e-4df1-b581-b1a2cfd96252'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:52:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:10.431 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:52:10 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:10.432 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[f87f685f-a2de-415c-899b-80521d82441a]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-fecbde57-58e9-4df0-aab7-14888d1477cc\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID fecbde57-58e9-4df0-aab7-14888d1477cc\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:11 compute-0 nova_compute[186662]: 2026-02-19 19:52:11.099 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:11 compute-0 podman[217850]: 2026-02-19 19:52:11.286764403 +0000 UTC m=+0.058918673 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Feb 19 19:52:11 compute-0 nova_compute[186662]: 2026-02-19 19:52:11.518 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:13 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:13.565 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:52:13 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:13.566 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:52:13 compute-0 nova_compute[186662]: 2026-02-19 19:52:13.598 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:14 compute-0 ovn_controller[96653]: 2026-02-19T19:52:14Z|00223|binding|INFO|Claiming lport ba35ea8b-14f8-4551-bd8f-febdd0d70b3d for this chassis.
Feb 19 19:52:14 compute-0 ovn_controller[96653]: 2026-02-19T19:52:14Z|00224|binding|INFO|ba35ea8b-14f8-4551-bd8f-febdd0d70b3d: Claiming fa:16:3e:5a:b5:95 10.100.0.9
Feb 19 19:52:14 compute-0 ovn_controller[96653]: 2026-02-19T19:52:14Z|00225|binding|INFO|Setting lport ba35ea8b-14f8-4551-bd8f-febdd0d70b3d up in Southbound
Feb 19 19:52:15 compute-0 nova_compute[186662]: 2026-02-19 19:52:15.419 186666 INFO nova.compute.manager [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Post operation of migration started
Feb 19 19:52:15 compute-0 nova_compute[186662]: 2026-02-19 19:52:15.419 186666 WARNING neutronclient.v2_0.client [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:52:15 compute-0 nova_compute[186662]: 2026-02-19 19:52:15.634 186666 WARNING neutronclient.v2_0.client [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:52:15 compute-0 nova_compute[186662]: 2026-02-19 19:52:15.635 186666 WARNING neutronclient.v2_0.client [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:52:15 compute-0 nova_compute[186662]: 2026-02-19 19:52:15.700 186666 DEBUG oslo_concurrency.lockutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-898cad44-87cf-4f81-a0b5-52a780c02ca8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:52:15 compute-0 nova_compute[186662]: 2026-02-19 19:52:15.700 186666 DEBUG oslo_concurrency.lockutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-898cad44-87cf-4f81-a0b5-52a780c02ca8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:52:15 compute-0 nova_compute[186662]: 2026-02-19 19:52:15.700 186666 DEBUG nova.network.neutron [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:52:16 compute-0 nova_compute[186662]: 2026-02-19 19:52:16.100 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:16 compute-0 podman[217883]: 2026-02-19 19:52:16.28038095 +0000 UTC m=+0.053889521 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, name=ubi9/ubi-minimal, summary=Provides the 
latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, com.redhat.component=ubi9-minimal-container)
Feb 19 19:52:16 compute-0 nova_compute[186662]: 2026-02-19 19:52:16.362 186666 WARNING neutronclient.v2_0.client [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:52:16 compute-0 nova_compute[186662]: 2026-02-19 19:52:16.520 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:16 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:16.568 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:52:16 compute-0 nova_compute[186662]: 2026-02-19 19:52:16.820 186666 WARNING neutronclient.v2_0.client [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:52:16 compute-0 nova_compute[186662]: 2026-02-19 19:52:16.949 186666 DEBUG nova.network.neutron [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Updating instance_info_cache with network_info: [{"id": "ba35ea8b-14f8-4551-bd8f-febdd0d70b3d", "address": "fa:16:3e:5a:b5:95", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapba35ea8b-14", "ovs_interfaceid": "ba35ea8b-14f8-4551-bd8f-febdd0d70b3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:52:17 compute-0 nova_compute[186662]: 2026-02-19 19:52:17.455 186666 DEBUG oslo_concurrency.lockutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-898cad44-87cf-4f81-a0b5-52a780c02ca8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:52:17 compute-0 nova_compute[186662]: 2026-02-19 19:52:17.980 186666 DEBUG oslo_concurrency.lockutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:52:17 compute-0 nova_compute[186662]: 2026-02-19 19:52:17.981 186666 DEBUG oslo_concurrency.lockutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:52:17 compute-0 nova_compute[186662]: 2026-02-19 19:52:17.981 186666 DEBUG oslo_concurrency.lockutils [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:52:17 compute-0 nova_compute[186662]: 2026-02-19 19:52:17.987 186666 INFO nova.virt.libvirt.driver [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 19 19:52:17 compute-0 virtqemud[186157]: Domain id=21 name='instance-0000001d' uuid=898cad44-87cf-4f81-a0b5-52a780c02ca8 is tainted: custom-monitor
Feb 19 19:52:18 compute-0 podman[217906]: 2026-02-19 19:52:18.323338991 +0000 UTC m=+0.093425011 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 19 19:52:18 compute-0 nova_compute[186662]: 2026-02-19 19:52:18.994 186666 INFO nova.virt.libvirt.driver [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 19 19:52:20 compute-0 nova_compute[186662]: 2026-02-19 19:52:19.999 186666 INFO nova.virt.libvirt.driver [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 19 19:52:20 compute-0 nova_compute[186662]: 2026-02-19 19:52:20.003 186666 DEBUG nova.compute.manager [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:52:20 compute-0 nova_compute[186662]: 2026-02-19 19:52:20.512 186666 DEBUG nova.objects.instance [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Feb 19 19:52:21 compute-0 nova_compute[186662]: 2026-02-19 19:52:21.103 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:21 compute-0 nova_compute[186662]: 2026-02-19 19:52:21.524 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:21 compute-0 nova_compute[186662]: 2026-02-19 19:52:21.594 186666 WARNING neutronclient.v2_0.client [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:52:22 compute-0 podman[217933]: 2026-02-19 19:52:22.270723737 +0000 UTC m=+0.052351772 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:52:22 compute-0 nova_compute[186662]: 2026-02-19 19:52:22.312 186666 WARNING neutronclient.v2_0.client [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:52:22 compute-0 nova_compute[186662]: 2026-02-19 19:52:22.312 186666 WARNING neutronclient.v2_0.client [None req-a5abe75f-51b4-4ba2-b128-c006a5f53e11 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:52:26 compute-0 nova_compute[186662]: 2026-02-19 19:52:26.105 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:26 compute-0 nova_compute[186662]: 2026-02-19 19:52:26.524 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.149 186666 DEBUG oslo_concurrency.lockutils [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Acquiring lock "898cad44-87cf-4f81-a0b5-52a780c02ca8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.150 186666 DEBUG oslo_concurrency.lockutils [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "898cad44-87cf-4f81-a0b5-52a780c02ca8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.150 186666 DEBUG oslo_concurrency.lockutils [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Acquiring lock "898cad44-87cf-4f81-a0b5-52a780c02ca8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.150 186666 DEBUG oslo_concurrency.lockutils [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "898cad44-87cf-4f81-a0b5-52a780c02ca8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.150 186666 DEBUG oslo_concurrency.lockutils [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "898cad44-87cf-4f81-a0b5-52a780c02ca8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.161 186666 INFO nova.compute.manager [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Terminating instance
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.675 186666 DEBUG nova.compute.manager [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:52:27 compute-0 kernel: tapba35ea8b-14 (unregistering): left promiscuous mode
Feb 19 19:52:27 compute-0 NetworkManager[56519]: <info>  [1771530747.7000] device (tapba35ea8b-14): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.708 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:27 compute-0 ovn_controller[96653]: 2026-02-19T19:52:27Z|00226|binding|INFO|Releasing lport ba35ea8b-14f8-4551-bd8f-febdd0d70b3d from this chassis (sb_readonly=0)
Feb 19 19:52:27 compute-0 ovn_controller[96653]: 2026-02-19T19:52:27Z|00227|binding|INFO|Setting lport ba35ea8b-14f8-4551-bd8f-febdd0d70b3d down in Southbound
Feb 19 19:52:27 compute-0 ovn_controller[96653]: 2026-02-19T19:52:27Z|00228|binding|INFO|Removing iface tapba35ea8b-14 ovn-installed in OVS
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.710 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.715 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:27.716 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:b5:95 10.100.0.9'], port_security=['fa:16:3e:5a:b5:95 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '898cad44-87cf-4f81-a0b5-52a780c02ca8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fecbde57-58e9-4df0-aab7-14888d1477cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '660950d25a914555816074d9de961374', 'neutron:revision_number': '14', 'neutron:security_group_ids': '479d32ff-921d-41a5-a067-ea1f3b43c40b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4b9fd371-626a-4670-b24e-8264e7c1e410, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=ba35ea8b-14f8-4551-bd8f-febdd0d70b3d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:52:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:27.717 105986 INFO neutron.agent.ovn.metadata.agent [-] Port ba35ea8b-14f8-4551-bd8f-febdd0d70b3d in datapath fecbde57-58e9-4df0-aab7-14888d1477cc unbound from our chassis
Feb 19 19:52:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:27.718 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fecbde57-58e9-4df0-aab7-14888d1477cc
Feb 19 19:52:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:27.731 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[6778ae29-0576-4cd4-9a4d-5afc00b76696]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:27.749 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[029b6721-da59-4a68-a43d-73c0324d8394]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:27.751 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[7c815cc0-abb6-41e4-aee6-46016eca0870]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:27 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Feb 19 19:52:27 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001d.scope: Consumed 2.057s CPU time.
Feb 19 19:52:27 compute-0 systemd-machined[156014]: Machine qemu-21-instance-0000001d terminated.
Feb 19 19:52:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:27.775 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[1d168d8f-37d7-4516-9772-221de1d6970c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:27.786 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[884f9a19-8d1e-4177-b944-d8ed416ed5f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfecbde57-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:02:e4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 46, 'tx_packets': 7, 'rx_bytes': 2212, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 46, 'tx_packets': 7, 'rx_bytes': 2212, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496968, 'reachable_time': 34968, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217968, 'error': None, 'target': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:27.795 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[8027b1ef-440a-4a1a-beab-e12c337d86c9]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfecbde57-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496977, 'tstamp': 496977}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217969, 'error': None, 'target': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfecbde57-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496980, 'tstamp': 496980}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217969, 'error': None, 'target': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:27.796 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfecbde57-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.798 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.801 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:27.802 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfecbde57-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:52:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:27.802 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:52:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:27.802 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfecbde57-50, col_values=(('external_ids', {'iface-id': '5190b70d-a81e-4df1-b581-b1a2cfd96252'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:52:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:27.802 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:52:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:27.803 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[acf5f124-8536-4dea-a2b8-dd32fed5f896]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-fecbde57-58e9-4df0-aab7-14888d1477cc\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID fecbde57-58e9-4df0-aab7-14888d1477cc\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.880 186666 DEBUG nova.compute.manager [req-794d120d-f009-4abd-9029-4540289ddb79 req-cfb16950-884b-492d-aa52-b93adb33ad87 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Received event network-vif-unplugged-ba35ea8b-14f8-4551-bd8f-febdd0d70b3d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.881 186666 DEBUG oslo_concurrency.lockutils [req-794d120d-f009-4abd-9029-4540289ddb79 req-cfb16950-884b-492d-aa52-b93adb33ad87 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "898cad44-87cf-4f81-a0b5-52a780c02ca8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.881 186666 DEBUG oslo_concurrency.lockutils [req-794d120d-f009-4abd-9029-4540289ddb79 req-cfb16950-884b-492d-aa52-b93adb33ad87 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "898cad44-87cf-4f81-a0b5-52a780c02ca8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.881 186666 DEBUG oslo_concurrency.lockutils [req-794d120d-f009-4abd-9029-4540289ddb79 req-cfb16950-884b-492d-aa52-b93adb33ad87 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "898cad44-87cf-4f81-a0b5-52a780c02ca8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.881 186666 DEBUG nova.compute.manager [req-794d120d-f009-4abd-9029-4540289ddb79 req-cfb16950-884b-492d-aa52-b93adb33ad87 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] No waiting events found dispatching network-vif-unplugged-ba35ea8b-14f8-4551-bd8f-febdd0d70b3d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.881 186666 DEBUG nova.compute.manager [req-794d120d-f009-4abd-9029-4540289ddb79 req-cfb16950-884b-492d-aa52-b93adb33ad87 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Received event network-vif-unplugged-ba35ea8b-14f8-4551-bd8f-febdd0d70b3d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.892 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.896 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.930 186666 INFO nova.virt.libvirt.driver [-] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Instance destroyed successfully.
Feb 19 19:52:27 compute-0 nova_compute[186662]: 2026-02-19 19:52:27.931 186666 DEBUG nova.objects.instance [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lazy-loading 'resources' on Instance uuid 898cad44-87cf-4f81-a0b5-52a780c02ca8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:52:28 compute-0 nova_compute[186662]: 2026-02-19 19:52:28.437 186666 DEBUG nova.virt.libvirt.vif [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-02-19T19:50:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-545163397',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-5451633',id=29,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:50:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='660950d25a914555816074d9de961374',ramdisk_id='',reservation_id='r-094cl00u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',clean_attempts='1',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',ima
ge_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1016029365',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1016029365-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:52:21Z,user_data=None,user_id='a7713a2fa0c241d2a75d4f3d928cdc1f',uuid=898cad44-87cf-4f81-a0b5-52a780c02ca8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ba35ea8b-14f8-4551-bd8f-febdd0d70b3d", "address": "fa:16:3e:5a:b5:95", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapba35ea8b-14", "ovs_interfaceid": "ba35ea8b-14f8-4551-bd8f-febdd0d70b3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:52:28 compute-0 nova_compute[186662]: 2026-02-19 19:52:28.438 186666 DEBUG nova.network.os_vif_util [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Converting VIF {"id": "ba35ea8b-14f8-4551-bd8f-febdd0d70b3d", "address": "fa:16:3e:5a:b5:95", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapba35ea8b-14", "ovs_interfaceid": "ba35ea8b-14f8-4551-bd8f-febdd0d70b3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:52:28 compute-0 nova_compute[186662]: 2026-02-19 19:52:28.438 186666 DEBUG nova.network.os_vif_util [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5a:b5:95,bridge_name='br-int',has_traffic_filtering=True,id=ba35ea8b-14f8-4551-bd8f-febdd0d70b3d,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapba35ea8b-14') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:52:28 compute-0 nova_compute[186662]: 2026-02-19 19:52:28.438 186666 DEBUG os_vif [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5a:b5:95,bridge_name='br-int',has_traffic_filtering=True,id=ba35ea8b-14f8-4551-bd8f-febdd0d70b3d,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapba35ea8b-14') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:52:28 compute-0 nova_compute[186662]: 2026-02-19 19:52:28.440 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:28 compute-0 nova_compute[186662]: 2026-02-19 19:52:28.440 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapba35ea8b-14, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:52:28 compute-0 nova_compute[186662]: 2026-02-19 19:52:28.474 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:28 compute-0 nova_compute[186662]: 2026-02-19 19:52:28.476 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:52:28 compute-0 nova_compute[186662]: 2026-02-19 19:52:28.477 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:28 compute-0 nova_compute[186662]: 2026-02-19 19:52:28.477 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=44a0ef84-1093-4f21-ab32-60ae0cc7c093) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:52:28 compute-0 nova_compute[186662]: 2026-02-19 19:52:28.477 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:28 compute-0 nova_compute[186662]: 2026-02-19 19:52:28.478 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:28 compute-0 nova_compute[186662]: 2026-02-19 19:52:28.480 186666 INFO os_vif [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5a:b5:95,bridge_name='br-int',has_traffic_filtering=True,id=ba35ea8b-14f8-4551-bd8f-febdd0d70b3d,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapba35ea8b-14')
Feb 19 19:52:28 compute-0 nova_compute[186662]: 2026-02-19 19:52:28.480 186666 INFO nova.virt.libvirt.driver [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Deleting instance files /var/lib/nova/instances/898cad44-87cf-4f81-a0b5-52a780c02ca8_del
Feb 19 19:52:28 compute-0 nova_compute[186662]: 2026-02-19 19:52:28.481 186666 INFO nova.virt.libvirt.driver [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Deletion of /var/lib/nova/instances/898cad44-87cf-4f81-a0b5-52a780c02ca8_del complete
Feb 19 19:52:28 compute-0 nova_compute[186662]: 2026-02-19 19:52:28.992 186666 INFO nova.compute.manager [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Took 1.32 seconds to destroy the instance on the hypervisor.
Feb 19 19:52:28 compute-0 nova_compute[186662]: 2026-02-19 19:52:28.992 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:52:28 compute-0 nova_compute[186662]: 2026-02-19 19:52:28.993 186666 DEBUG nova.compute.manager [-] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:52:28 compute-0 nova_compute[186662]: 2026-02-19 19:52:28.993 186666 DEBUG nova.network.neutron [-] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:52:28 compute-0 nova_compute[186662]: 2026-02-19 19:52:28.993 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:52:29 compute-0 nova_compute[186662]: 2026-02-19 19:52:29.305 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:52:29 compute-0 nova_compute[186662]: 2026-02-19 19:52:29.676 186666 DEBUG nova.compute.manager [req-aba6218a-45cc-4a1e-844f-44b0387a24ee req-24edadfd-8cb6-491c-a809-49f82d9c63d3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Received event network-vif-deleted-ba35ea8b-14f8-4551-bd8f-febdd0d70b3d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:52:29 compute-0 nova_compute[186662]: 2026-02-19 19:52:29.677 186666 INFO nova.compute.manager [req-aba6218a-45cc-4a1e-844f-44b0387a24ee req-24edadfd-8cb6-491c-a809-49f82d9c63d3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Neutron deleted interface ba35ea8b-14f8-4551-bd8f-febdd0d70b3d; detaching it from the instance and deleting it from the info cache
Feb 19 19:52:29 compute-0 nova_compute[186662]: 2026-02-19 19:52:29.677 186666 DEBUG nova.network.neutron [req-aba6218a-45cc-4a1e-844f-44b0387a24ee req-24edadfd-8cb6-491c-a809-49f82d9c63d3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:52:29 compute-0 podman[196025]: time="2026-02-19T19:52:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:52:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:52:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17221 "" "Go-http-client/1.1"
Feb 19 19:52:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:52:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2666 "" "Go-http-client/1.1"
Feb 19 19:52:29 compute-0 sshd-session[217987]: Invalid user httpd from 189.165.79.177 port 39224
Feb 19 19:52:29 compute-0 sshd-session[217987]: Received disconnect from 189.165.79.177 port 39224:11: Bye Bye [preauth]
Feb 19 19:52:29 compute-0 sshd-session[217987]: Disconnected from invalid user httpd 189.165.79.177 port 39224 [preauth]
Feb 19 19:52:29 compute-0 nova_compute[186662]: 2026-02-19 19:52:29.933 186666 DEBUG nova.compute.manager [req-b5c2a591-4672-4ca2-9f75-a639e8b4597d req-e77aa0b7-7f02-4354-9f5f-b3e3f6d7a62c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Received event network-vif-unplugged-ba35ea8b-14f8-4551-bd8f-febdd0d70b3d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:52:29 compute-0 nova_compute[186662]: 2026-02-19 19:52:29.934 186666 DEBUG oslo_concurrency.lockutils [req-b5c2a591-4672-4ca2-9f75-a639e8b4597d req-e77aa0b7-7f02-4354-9f5f-b3e3f6d7a62c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "898cad44-87cf-4f81-a0b5-52a780c02ca8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:52:29 compute-0 nova_compute[186662]: 2026-02-19 19:52:29.934 186666 DEBUG oslo_concurrency.lockutils [req-b5c2a591-4672-4ca2-9f75-a639e8b4597d req-e77aa0b7-7f02-4354-9f5f-b3e3f6d7a62c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "898cad44-87cf-4f81-a0b5-52a780c02ca8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:52:29 compute-0 nova_compute[186662]: 2026-02-19 19:52:29.934 186666 DEBUG oslo_concurrency.lockutils [req-b5c2a591-4672-4ca2-9f75-a639e8b4597d req-e77aa0b7-7f02-4354-9f5f-b3e3f6d7a62c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "898cad44-87cf-4f81-a0b5-52a780c02ca8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:52:29 compute-0 nova_compute[186662]: 2026-02-19 19:52:29.934 186666 DEBUG nova.compute.manager [req-b5c2a591-4672-4ca2-9f75-a639e8b4597d req-e77aa0b7-7f02-4354-9f5f-b3e3f6d7a62c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] No waiting events found dispatching network-vif-unplugged-ba35ea8b-14f8-4551-bd8f-febdd0d70b3d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:52:29 compute-0 nova_compute[186662]: 2026-02-19 19:52:29.934 186666 DEBUG nova.compute.manager [req-b5c2a591-4672-4ca2-9f75-a639e8b4597d req-e77aa0b7-7f02-4354-9f5f-b3e3f6d7a62c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Received event network-vif-unplugged-ba35ea8b-14f8-4551-bd8f-febdd0d70b3d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:52:30 compute-0 nova_compute[186662]: 2026-02-19 19:52:30.100 186666 DEBUG nova.network.neutron [-] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:52:30 compute-0 nova_compute[186662]: 2026-02-19 19:52:30.185 186666 DEBUG nova.compute.manager [req-aba6218a-45cc-4a1e-844f-44b0387a24ee req-24edadfd-8cb6-491c-a809-49f82d9c63d3 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Detach interface failed, port_id=ba35ea8b-14f8-4551-bd8f-febdd0d70b3d, reason: Instance 898cad44-87cf-4f81-a0b5-52a780c02ca8 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Feb 19 19:52:30 compute-0 nova_compute[186662]: 2026-02-19 19:52:30.606 186666 INFO nova.compute.manager [-] [instance: 898cad44-87cf-4f81-a0b5-52a780c02ca8] Took 1.61 seconds to deallocate network for instance.
Feb 19 19:52:31 compute-0 nova_compute[186662]: 2026-02-19 19:52:31.129 186666 DEBUG oslo_concurrency.lockutils [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:52:31 compute-0 nova_compute[186662]: 2026-02-19 19:52:31.130 186666 DEBUG oslo_concurrency.lockutils [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:52:31 compute-0 nova_compute[186662]: 2026-02-19 19:52:31.135 186666 DEBUG oslo_concurrency.lockutils [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:52:31 compute-0 nova_compute[186662]: 2026-02-19 19:52:31.158 186666 INFO nova.scheduler.client.report [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Deleted allocations for instance 898cad44-87cf-4f81-a0b5-52a780c02ca8
Feb 19 19:52:31 compute-0 openstack_network_exporter[198916]: ERROR   19:52:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:52:31 compute-0 openstack_network_exporter[198916]: ERROR   19:52:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:52:31 compute-0 nova_compute[186662]: 2026-02-19 19:52:31.526 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:31 compute-0 nova_compute[186662]: 2026-02-19 19:52:31.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:52:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:32.162 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:52:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:32.162 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:52:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:32.163 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:52:32 compute-0 nova_compute[186662]: 2026-02-19 19:52:32.188 186666 DEBUG oslo_concurrency.lockutils [None req-9a1a44d4-2427-463a-a303-c45b55fb73ea a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "898cad44-87cf-4f81-a0b5-52a780c02ca8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.039s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:52:32 compute-0 nova_compute[186662]: 2026-02-19 19:52:32.547 186666 DEBUG oslo_concurrency.lockutils [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Acquiring lock "3e898ccb-66b6-4591-8f6f-32c7ba62d150" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:52:32 compute-0 nova_compute[186662]: 2026-02-19 19:52:32.548 186666 DEBUG oslo_concurrency.lockutils [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "3e898ccb-66b6-4591-8f6f-32c7ba62d150" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:52:32 compute-0 nova_compute[186662]: 2026-02-19 19:52:32.548 186666 DEBUG oslo_concurrency.lockutils [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Acquiring lock "3e898ccb-66b6-4591-8f6f-32c7ba62d150-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:52:32 compute-0 nova_compute[186662]: 2026-02-19 19:52:32.548 186666 DEBUG oslo_concurrency.lockutils [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "3e898ccb-66b6-4591-8f6f-32c7ba62d150-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:52:32 compute-0 nova_compute[186662]: 2026-02-19 19:52:32.549 186666 DEBUG oslo_concurrency.lockutils [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "3e898ccb-66b6-4591-8f6f-32c7ba62d150-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:52:32 compute-0 nova_compute[186662]: 2026-02-19 19:52:32.560 186666 INFO nova.compute.manager [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Terminating instance
Feb 19 19:52:32 compute-0 nova_compute[186662]: 2026-02-19 19:52:32.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.077 186666 DEBUG nova.compute.manager [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:52:33 compute-0 kernel: tap0f888555-8d (unregistering): left promiscuous mode
Feb 19 19:52:33 compute-0 NetworkManager[56519]: <info>  [1771530753.1048] device (tap0f888555-8d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:52:33 compute-0 ovn_controller[96653]: 2026-02-19T19:52:33Z|00229|binding|INFO|Releasing lport 0f888555-8d11-4dff-8761-be5721afca44 from this chassis (sb_readonly=0)
Feb 19 19:52:33 compute-0 ovn_controller[96653]: 2026-02-19T19:52:33Z|00230|binding|INFO|Setting lport 0f888555-8d11-4dff-8761-be5721afca44 down in Southbound
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.110 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:33 compute-0 ovn_controller[96653]: 2026-02-19T19:52:33Z|00231|binding|INFO|Removing iface tap0f888555-8d ovn-installed in OVS
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.111 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.111 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:33.116 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:70:5c 10.100.0.14'], port_security=['fa:16:3e:ba:70:5c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '3e898ccb-66b6-4591-8f6f-32c7ba62d150', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fecbde57-58e9-4df0-aab7-14888d1477cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '660950d25a914555816074d9de961374', 'neutron:revision_number': '14', 'neutron:security_group_ids': '479d32ff-921d-41a5-a067-ea1f3b43c40b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4b9fd371-626a-4670-b24e-8264e7c1e410, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=0f888555-8d11-4dff-8761-be5721afca44) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:52:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:33.116 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 0f888555-8d11-4dff-8761-be5721afca44 in datapath fecbde57-58e9-4df0-aab7-14888d1477cc unbound from our chassis
Feb 19 19:52:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:33.117 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fecbde57-58e9-4df0-aab7-14888d1477cc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:52:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:33.118 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[d5f5df7a-8c95-4a10-a029-f64bdc5b01a8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:33.118 105986 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc namespace which is not needed anymore
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.128 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:33 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Feb 19 19:52:33 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001c.scope: Consumed 3.105s CPU time.
Feb 19 19:52:33 compute-0 systemd-machined[156014]: Machine qemu-20-instance-0000001c terminated.
Feb 19 19:52:33 compute-0 neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc[217672]: [NOTICE]   (217692) : haproxy version is 3.0.5-8e879a5
Feb 19 19:52:33 compute-0 neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc[217672]: [NOTICE]   (217692) : path to executable is /usr/sbin/haproxy
Feb 19 19:52:33 compute-0 neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc[217672]: [WARNING]  (217692) : Exiting Master process...
Feb 19 19:52:33 compute-0 podman[218017]: 2026-02-19 19:52:33.224759789 +0000 UTC m=+0.025623913 container kill bee092eb444e2bbf76950bd3a2eb75874e1be3778cd2d7ccd8d2502f886179bc (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.225 186666 DEBUG nova.compute.manager [req-48129ae7-c4b5-472d-8348-367060c2b6d0 req-9531d05d-f5fc-4f15-8cfb-d79e3eed89c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Received event network-vif-unplugged-0f888555-8d11-4dff-8761-be5721afca44 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:52:33 compute-0 neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc[217672]: [ALERT]    (217692) : Current worker (217696) exited with code 143 (Terminated)
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.226 186666 DEBUG oslo_concurrency.lockutils [req-48129ae7-c4b5-472d-8348-367060c2b6d0 req-9531d05d-f5fc-4f15-8cfb-d79e3eed89c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "3e898ccb-66b6-4591-8f6f-32c7ba62d150-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:52:33 compute-0 neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc[217672]: [WARNING]  (217692) : All workers exited. Exiting... (0)
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.226 186666 DEBUG oslo_concurrency.lockutils [req-48129ae7-c4b5-472d-8348-367060c2b6d0 req-9531d05d-f5fc-4f15-8cfb-d79e3eed89c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "3e898ccb-66b6-4591-8f6f-32c7ba62d150-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.226 186666 DEBUG oslo_concurrency.lockutils [req-48129ae7-c4b5-472d-8348-367060c2b6d0 req-9531d05d-f5fc-4f15-8cfb-d79e3eed89c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "3e898ccb-66b6-4591-8f6f-32c7ba62d150-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.226 186666 DEBUG nova.compute.manager [req-48129ae7-c4b5-472d-8348-367060c2b6d0 req-9531d05d-f5fc-4f15-8cfb-d79e3eed89c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] No waiting events found dispatching network-vif-unplugged-0f888555-8d11-4dff-8761-be5721afca44 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.227 186666 DEBUG nova.compute.manager [req-48129ae7-c4b5-472d-8348-367060c2b6d0 req-9531d05d-f5fc-4f15-8cfb-d79e3eed89c5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Received event network-vif-unplugged-0f888555-8d11-4dff-8761-be5721afca44 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:52:33 compute-0 systemd[1]: libpod-bee092eb444e2bbf76950bd3a2eb75874e1be3778cd2d7ccd8d2502f886179bc.scope: Deactivated successfully.
Feb 19 19:52:33 compute-0 podman[218032]: 2026-02-19 19:52:33.271959676 +0000 UTC m=+0.028035812 container died bee092eb444e2bbf76950bd3a2eb75874e1be3778cd2d7ccd8d2502f886179bc (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:52:33 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bee092eb444e2bbf76950bd3a2eb75874e1be3778cd2d7ccd8d2502f886179bc-userdata-shm.mount: Deactivated successfully.
Feb 19 19:52:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-6a5365a9830605f29216d04ea7c386785277cb90da12c9d7eeeeed426d80a3e8-merged.mount: Deactivated successfully.
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.316 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:33 compute-0 podman[218032]: 2026-02-19 19:52:33.323062758 +0000 UTC m=+0.079138884 container cleanup bee092eb444e2bbf76950bd3a2eb75874e1be3778cd2d7ccd8d2502f886179bc (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Feb 19 19:52:33 compute-0 systemd[1]: libpod-conmon-bee092eb444e2bbf76950bd3a2eb75874e1be3778cd2d7ccd8d2502f886179bc.scope: Deactivated successfully.
Feb 19 19:52:33 compute-0 podman[218034]: 2026-02-19 19:52:33.337187401 +0000 UTC m=+0.088325197 container remove bee092eb444e2bbf76950bd3a2eb75874e1be3778cd2d7ccd8d2502f886179bc (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:52:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:33.341 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[5e0242df-d093-4cd0-bbf6-582a82ee34bd]: (4, ("Thu Feb 19 07:52:33 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc (bee092eb444e2bbf76950bd3a2eb75874e1be3778cd2d7ccd8d2502f886179bc)\nbee092eb444e2bbf76950bd3a2eb75874e1be3778cd2d7ccd8d2502f886179bc\nThu Feb 19 07:52:33 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc (bee092eb444e2bbf76950bd3a2eb75874e1be3778cd2d7ccd8d2502f886179bc)\nbee092eb444e2bbf76950bd3a2eb75874e1be3778cd2d7ccd8d2502f886179bc\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:33.342 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[78840f47-2fb2-432f-9665-a6c11e4bbb29]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:33.343 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fecbde57-58e9-4df0-aab7-14888d1477cc.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:52:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:33.343 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ea6c27cb-0436-4f28-9295-2b2c83a18a71]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:33.344 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfecbde57-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.345 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:33 compute-0 kernel: tapfecbde57-50: left promiscuous mode
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.352 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.353 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:33.354 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[c9246655-10f6-4b69-a915-5b4366a225b9]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.361 186666 INFO nova.virt.libvirt.driver [-] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Instance destroyed successfully.
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.362 186666 DEBUG nova.objects.instance [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lazy-loading 'resources' on Instance uuid 3e898ccb-66b6-4591-8f6f-32c7ba62d150 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:52:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:33.365 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[e1c6a87b-e570-47dd-9b84-f2bdda8fbfc8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:33.365 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[725ec922-cde4-4c79-a5f4-b550fd62ae54]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:33.379 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a15d8c9d-7b09-4ea5-b87d-5e3d3f0d3ca8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496963, 'reachable_time': 38973, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218085, 'error': None, 'target': 'ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:33 compute-0 systemd[1]: run-netns-ovnmeta\x2dfecbde57\x2d58e9\x2d4df0\x2daab7\x2d14888d1477cc.mount: Deactivated successfully.
Feb 19 19:52:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:33.382 106358 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fecbde57-58e9-4df0-aab7-14888d1477cc deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Feb 19 19:52:33 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:33.382 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[a546cba7-ce28-406c-b392-a0f217f8e7aa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.477 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.867 186666 DEBUG nova.virt.libvirt.vif [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-02-19T19:50:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1020024042',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1020024',id=28,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:50:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='660950d25a914555816074d9de961374',ramdisk_id='',reservation_id='r-ao0zy72u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',clean_attempts='1',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',im
age_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1016029365',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1016029365-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:51:50Z,user_data=None,user_id='a7713a2fa0c241d2a75d4f3d928cdc1f',uuid=3e898ccb-66b6-4591-8f6f-32c7ba62d150,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0f888555-8d11-4dff-8761-be5721afca44", "address": "fa:16:3e:ba:70:5c", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f888555-8d", "ovs_interfaceid": "0f888555-8d11-4dff-8761-be5721afca44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.868 186666 DEBUG nova.network.os_vif_util [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Converting VIF {"id": "0f888555-8d11-4dff-8761-be5721afca44", "address": "fa:16:3e:ba:70:5c", "network": {"id": "fecbde57-58e9-4df0-aab7-14888d1477cc", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-17879669-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09b52bffe1f548839b74e94d80ad9eb7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f888555-8d", "ovs_interfaceid": "0f888555-8d11-4dff-8761-be5721afca44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.868 186666 DEBUG nova.network.os_vif_util [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ba:70:5c,bridge_name='br-int',has_traffic_filtering=True,id=0f888555-8d11-4dff-8761-be5721afca44,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f888555-8d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.868 186666 DEBUG os_vif [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ba:70:5c,bridge_name='br-int',has_traffic_filtering=True,id=0f888555-8d11-4dff-8761-be5721afca44,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f888555-8d') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.869 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.870 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f888555-8d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.871 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.872 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.874 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.875 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.876 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=d6f81bd2-af7b-448b-a64e-ea8ca4ebf62b) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.877 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.879 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.881 186666 INFO os_vif [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ba:70:5c,bridge_name='br-int',has_traffic_filtering=True,id=0f888555-8d11-4dff-8761-be5721afca44,network=Network(fecbde57-58e9-4df0-aab7-14888d1477cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f888555-8d')
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.882 186666 INFO nova.virt.libvirt.driver [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Deleting instance files /var/lib/nova/instances/3e898ccb-66b6-4591-8f6f-32c7ba62d150_del
Feb 19 19:52:33 compute-0 nova_compute[186662]: 2026-02-19 19:52:33.883 186666 INFO nova.virt.libvirt.driver [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Deletion of /var/lib/nova/instances/3e898ccb-66b6-4591-8f6f-32c7ba62d150_del complete
Feb 19 19:52:34 compute-0 nova_compute[186662]: 2026-02-19 19:52:34.396 186666 INFO nova.compute.manager [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Took 1.32 seconds to destroy the instance on the hypervisor.
Feb 19 19:52:34 compute-0 nova_compute[186662]: 2026-02-19 19:52:34.396 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:52:34 compute-0 nova_compute[186662]: 2026-02-19 19:52:34.397 186666 DEBUG nova.compute.manager [-] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:52:34 compute-0 nova_compute[186662]: 2026-02-19 19:52:34.397 186666 DEBUG nova.network.neutron [-] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:52:34 compute-0 nova_compute[186662]: 2026-02-19 19:52:34.397 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:52:34 compute-0 nova_compute[186662]: 2026-02-19 19:52:34.742 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:52:35 compute-0 nova_compute[186662]: 2026-02-19 19:52:35.040 186666 DEBUG nova.compute.manager [req-4dc7a875-4009-489a-84e5-39c45c459835 req-3e72bac2-edc6-449d-8099-73d2a3df2294 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Received event network-vif-deleted-0f888555-8d11-4dff-8761-be5721afca44 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:52:35 compute-0 nova_compute[186662]: 2026-02-19 19:52:35.040 186666 INFO nova.compute.manager [req-4dc7a875-4009-489a-84e5-39c45c459835 req-3e72bac2-edc6-449d-8099-73d2a3df2294 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Neutron deleted interface 0f888555-8d11-4dff-8761-be5721afca44; detaching it from the instance and deleting it from the info cache
Feb 19 19:52:35 compute-0 nova_compute[186662]: 2026-02-19 19:52:35.040 186666 DEBUG nova.network.neutron [req-4dc7a875-4009-489a-84e5-39c45c459835 req-3e72bac2-edc6-449d-8099-73d2a3df2294 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:52:35 compute-0 nova_compute[186662]: 2026-02-19 19:52:35.296 186666 DEBUG nova.compute.manager [req-4f52aa6d-2780-4207-87f4-58008a417a79 req-b8c26c7b-6e4f-442b-a495-ce807145d3f5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Received event network-vif-unplugged-0f888555-8d11-4dff-8761-be5721afca44 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:52:35 compute-0 nova_compute[186662]: 2026-02-19 19:52:35.297 186666 DEBUG oslo_concurrency.lockutils [req-4f52aa6d-2780-4207-87f4-58008a417a79 req-b8c26c7b-6e4f-442b-a495-ce807145d3f5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "3e898ccb-66b6-4591-8f6f-32c7ba62d150-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:52:35 compute-0 nova_compute[186662]: 2026-02-19 19:52:35.297 186666 DEBUG oslo_concurrency.lockutils [req-4f52aa6d-2780-4207-87f4-58008a417a79 req-b8c26c7b-6e4f-442b-a495-ce807145d3f5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "3e898ccb-66b6-4591-8f6f-32c7ba62d150-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:52:35 compute-0 nova_compute[186662]: 2026-02-19 19:52:35.297 186666 DEBUG oslo_concurrency.lockutils [req-4f52aa6d-2780-4207-87f4-58008a417a79 req-b8c26c7b-6e4f-442b-a495-ce807145d3f5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "3e898ccb-66b6-4591-8f6f-32c7ba62d150-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:52:35 compute-0 nova_compute[186662]: 2026-02-19 19:52:35.297 186666 DEBUG nova.compute.manager [req-4f52aa6d-2780-4207-87f4-58008a417a79 req-b8c26c7b-6e4f-442b-a495-ce807145d3f5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] No waiting events found dispatching network-vif-unplugged-0f888555-8d11-4dff-8761-be5721afca44 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:52:35 compute-0 nova_compute[186662]: 2026-02-19 19:52:35.298 186666 DEBUG nova.compute.manager [req-4f52aa6d-2780-4207-87f4-58008a417a79 req-b8c26c7b-6e4f-442b-a495-ce807145d3f5 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Received event network-vif-unplugged-0f888555-8d11-4dff-8761-be5721afca44 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:52:35 compute-0 nova_compute[186662]: 2026-02-19 19:52:35.502 186666 DEBUG nova.network.neutron [-] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:52:35 compute-0 nova_compute[186662]: 2026-02-19 19:52:35.548 186666 DEBUG nova.compute.manager [req-4dc7a875-4009-489a-84e5-39c45c459835 req-3e72bac2-edc6-449d-8099-73d2a3df2294 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Detach interface failed, port_id=0f888555-8d11-4dff-8761-be5721afca44, reason: Instance 3e898ccb-66b6-4591-8f6f-32c7ba62d150 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Feb 19 19:52:36 compute-0 nova_compute[186662]: 2026-02-19 19:52:36.008 186666 INFO nova.compute.manager [-] [instance: 3e898ccb-66b6-4591-8f6f-32c7ba62d150] Took 1.61 seconds to deallocate network for instance.
Feb 19 19:52:36 compute-0 nova_compute[186662]: 2026-02-19 19:52:36.523 186666 DEBUG oslo_concurrency.lockutils [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:52:36 compute-0 nova_compute[186662]: 2026-02-19 19:52:36.523 186666 DEBUG oslo_concurrency.lockutils [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:52:36 compute-0 nova_compute[186662]: 2026-02-19 19:52:36.527 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:36 compute-0 nova_compute[186662]: 2026-02-19 19:52:36.528 186666 DEBUG oslo_concurrency.lockutils [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:52:36 compute-0 nova_compute[186662]: 2026-02-19 19:52:36.551 186666 INFO nova.scheduler.client.report [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Deleted allocations for instance 3e898ccb-66b6-4591-8f6f-32c7ba62d150
Feb 19 19:52:36 compute-0 nova_compute[186662]: 2026-02-19 19:52:36.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:52:36 compute-0 nova_compute[186662]: 2026-02-19 19:52:36.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:52:36 compute-0 nova_compute[186662]: 2026-02-19 19:52:36.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:52:37 compute-0 nova_compute[186662]: 2026-02-19 19:52:37.577 186666 DEBUG oslo_concurrency.lockutils [None req-257bf829-e673-41bb-a72d-583fb5b8ba2c a7713a2fa0c241d2a75d4f3d928cdc1f 660950d25a914555816074d9de961374 - - default default] Lock "3e898ccb-66b6-4591-8f6f-32c7ba62d150" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.029s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:52:38 compute-0 nova_compute[186662]: 2026-02-19 19:52:38.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:52:38 compute-0 nova_compute[186662]: 2026-02-19 19:52:38.878 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:39 compute-0 nova_compute[186662]: 2026-02-19 19:52:39.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:52:40 compute-0 nova_compute[186662]: 2026-02-19 19:52:40.083 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:52:40 compute-0 nova_compute[186662]: 2026-02-19 19:52:40.084 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:52:40 compute-0 nova_compute[186662]: 2026-02-19 19:52:40.084 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:52:40 compute-0 nova_compute[186662]: 2026-02-19 19:52:40.084 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:52:40 compute-0 nova_compute[186662]: 2026-02-19 19:52:40.197 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:52:40 compute-0 nova_compute[186662]: 2026-02-19 19:52:40.198 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:52:40 compute-0 nova_compute[186662]: 2026-02-19 19:52:40.210 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:52:40 compute-0 nova_compute[186662]: 2026-02-19 19:52:40.211 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5825MB free_disk=72.96769332885742GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:52:40 compute-0 nova_compute[186662]: 2026-02-19 19:52:40.211 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:52:40 compute-0 nova_compute[186662]: 2026-02-19 19:52:40.211 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:52:41 compute-0 nova_compute[186662]: 2026-02-19 19:52:41.245 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:52:41 compute-0 nova_compute[186662]: 2026-02-19 19:52:41.246 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:52:40 up  1:23,  0 user,  load average: 0.17, 0.13, 0.20\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:52:41 compute-0 nova_compute[186662]: 2026-02-19 19:52:41.291 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:52:41 compute-0 nova_compute[186662]: 2026-02-19 19:52:41.529 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:41 compute-0 nova_compute[186662]: 2026-02-19 19:52:41.801 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:52:42 compute-0 podman[218088]: 2026-02-19 19:52:42.274499868 +0000 UTC m=+0.048434168 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Feb 19 19:52:42 compute-0 nova_compute[186662]: 2026-02-19 19:52:42.310 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:52:42 compute-0 nova_compute[186662]: 2026-02-19 19:52:42.311 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.099s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:52:42 compute-0 nova_compute[186662]: 2026-02-19 19:52:42.430 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:43 compute-0 nova_compute[186662]: 2026-02-19 19:52:43.880 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:45 compute-0 nova_compute[186662]: 2026-02-19 19:52:45.311 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:52:46 compute-0 nova_compute[186662]: 2026-02-19 19:52:46.530 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:47 compute-0 podman[218109]: 2026-02-19 19:52:47.277543908 +0000 UTC m=+0.056091455 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, io.openshift.expose-services=, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 19 19:52:48 compute-0 nova_compute[186662]: 2026-02-19 19:52:48.916 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:49 compute-0 podman[218132]: 2026-02-19 19:52:49.314322999 +0000 UTC m=+0.084073074 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 
10 Base Image, tcib_build_tag=watcher_latest)
Feb 19 19:52:51 compute-0 nova_compute[186662]: 2026-02-19 19:52:51.532 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:52 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:52.844 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:50:71 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4e2c7ed37304bc098ed5d11e1ab4d17', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ddef6f8-604e-4b41-9f83-1182f76298b6, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4d5db50f-b83e-4965-bddf-4fd62a569e63) old=Port_Binding(mac=['fa:16:3e:ff:50:71'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4e2c7ed37304bc098ed5d11e1ab4d17', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:52:52 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:52.845 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4d5db50f-b83e-4965-bddf-4fd62a569e63 in datapath 2cb93231-0e3e-4efd-8b0c-4366500dbd16 updated
Feb 19 19:52:52 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:52.846 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2cb93231-0e3e-4efd-8b0c-4366500dbd16, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:52:52 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:52:52.847 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[11675bbc-400a-4d9e-8587-b7406775dd43]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:52:53 compute-0 podman[218159]: 2026-02-19 19:52:53.255360402 +0000 UTC m=+0.037138764 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:52:53 compute-0 nova_compute[186662]: 2026-02-19 19:52:53.918 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:56 compute-0 sshd-session[218184]: Invalid user ubuntu from 106.51.64.128 port 25542
Feb 19 19:52:56 compute-0 nova_compute[186662]: 2026-02-19 19:52:56.536 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:56 compute-0 sshd-session[218184]: Received disconnect from 106.51.64.128 port 25542:11: Bye Bye [preauth]
Feb 19 19:52:56 compute-0 sshd-session[218184]: Disconnected from invalid user ubuntu 106.51.64.128 port 25542 [preauth]
Feb 19 19:52:58 compute-0 nova_compute[186662]: 2026-02-19 19:52:58.919 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:52:59 compute-0 podman[196025]: time="2026-02-19T19:52:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:52:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:52:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:52:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:52:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2200 "" "Go-http-client/1.1"
Feb 19 19:53:01 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:01.333 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:1b:ee 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c746e887-e640-4b3e-894b-b895d6b510ac', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c746e887-e640-4b3e-894b-b895d6b510ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dafcf6090521473590cdb432a889739e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4754d47-5ba6-4ec2-b216-99dc11fa4a7e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e150c05d-ea6f-4b0f-97f4-8757979aeb12) old=Port_Binding(mac=['fa:16:3e:e1:1b:ee'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-c746e887-e640-4b3e-894b-b895d6b510ac', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c746e887-e640-4b3e-894b-b895d6b510ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dafcf6090521473590cdb432a889739e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:53:01 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:01.335 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e150c05d-ea6f-4b0f-97f4-8757979aeb12 in datapath c746e887-e640-4b3e-894b-b895d6b510ac updated
Feb 19 19:53:01 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:01.336 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c746e887-e640-4b3e-894b-b895d6b510ac, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:53:01 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:01.339 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[b05da8d5-f6cd-4166-86de-0ced8a9cfd1a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:53:01 compute-0 openstack_network_exporter[198916]: ERROR   19:53:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:53:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:53:01 compute-0 openstack_network_exporter[198916]: ERROR   19:53:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:53:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:53:01 compute-0 nova_compute[186662]: 2026-02-19 19:53:01.575 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:03 compute-0 nova_compute[186662]: 2026-02-19 19:53:03.922 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:06 compute-0 nova_compute[186662]: 2026-02-19 19:53:06.577 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:08 compute-0 nova_compute[186662]: 2026-02-19 19:53:08.924 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:11 compute-0 nova_compute[186662]: 2026-02-19 19:53:11.578 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:13 compute-0 podman[218186]: 2026-02-19 19:53:13.291463958 +0000 UTC m=+0.071901608 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 19 19:53:13 compute-0 nova_compute[186662]: 2026-02-19 19:53:13.925 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:16 compute-0 nova_compute[186662]: 2026-02-19 19:53:16.579 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:16 compute-0 ovn_controller[96653]: 2026-02-19T19:53:16Z|00232|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 19 19:53:18 compute-0 podman[218206]: 2026-02-19 19:53:18.293045611 +0000 UTC m=+0.067496481 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, architecture=x86_64, version=9.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, vcs-type=git, io.buildah.version=1.33.7, config_id=openstack_network_exporter)
Feb 19 19:53:18 compute-0 nova_compute[186662]: 2026-02-19 19:53:18.927 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:20 compute-0 podman[218227]: 2026-02-19 19:53:20.293369877 +0000 UTC m=+0.070602108 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Feb 19 19:53:21 compute-0 nova_compute[186662]: 2026-02-19 19:53:21.580 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:23 compute-0 nova_compute[186662]: 2026-02-19 19:53:23.929 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:24 compute-0 podman[218253]: 2026-02-19 19:53:24.29289807 +0000 UTC m=+0.060536441 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:53:26 compute-0 nova_compute[186662]: 2026-02-19 19:53:26.580 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:28 compute-0 nova_compute[186662]: 2026-02-19 19:53:28.930 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:29 compute-0 podman[196025]: time="2026-02-19T19:53:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:53:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:53:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:53:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:53:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2194 "" "Go-http-client/1.1"
Feb 19 19:53:30 compute-0 sshd-session[218278]: Received disconnect from 96.78.175.42 port 36634:11: Bye Bye [preauth]
Feb 19 19:53:30 compute-0 sshd-session[218278]: Disconnected from authenticating user root 96.78.175.42 port 36634 [preauth]
Feb 19 19:53:30 compute-0 nova_compute[186662]: 2026-02-19 19:53:30.394 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "0b6d0709-7667-4eda-a3b1-ba3adf926203" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:53:30 compute-0 nova_compute[186662]: 2026-02-19 19:53:30.394 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "0b6d0709-7667-4eda-a3b1-ba3adf926203" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:53:30 compute-0 nova_compute[186662]: 2026-02-19 19:53:30.909 186666 DEBUG nova.compute.manager [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Feb 19 19:53:31 compute-0 openstack_network_exporter[198916]: ERROR   19:53:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:53:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:53:31 compute-0 openstack_network_exporter[198916]: ERROR   19:53:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:53:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:53:31 compute-0 nova_compute[186662]: 2026-02-19 19:53:31.525 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:53:31 compute-0 nova_compute[186662]: 2026-02-19 19:53:31.526 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:53:31 compute-0 nova_compute[186662]: 2026-02-19 19:53:31.531 186666 DEBUG nova.virt.hardware [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Feb 19 19:53:31 compute-0 nova_compute[186662]: 2026-02-19 19:53:31.532 186666 INFO nova.compute.claims [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Claim successful on node compute-0.ctlplane.example.com
Feb 19 19:53:31 compute-0 nova_compute[186662]: 2026-02-19 19:53:31.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:53:31 compute-0 nova_compute[186662]: 2026-02-19 19:53:31.631 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:32.164 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:53:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:32.164 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:53:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:32.165 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:53:32 compute-0 nova_compute[186662]: 2026-02-19 19:53:32.597 186666 DEBUG nova.compute.provider_tree [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:53:33 compute-0 nova_compute[186662]: 2026-02-19 19:53:33.105 186666 DEBUG nova.scheduler.client.report [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:53:33 compute-0 nova_compute[186662]: 2026-02-19 19:53:33.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:53:33 compute-0 nova_compute[186662]: 2026-02-19 19:53:33.631 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.105s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:53:33 compute-0 nova_compute[186662]: 2026-02-19 19:53:33.632 186666 DEBUG nova.compute.manager [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Feb 19 19:53:33 compute-0 nova_compute[186662]: 2026-02-19 19:53:33.933 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:34 compute-0 nova_compute[186662]: 2026-02-19 19:53:34.150 186666 DEBUG nova.compute.manager [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Feb 19 19:53:34 compute-0 nova_compute[186662]: 2026-02-19 19:53:34.151 186666 DEBUG nova.network.neutron [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Feb 19 19:53:34 compute-0 nova_compute[186662]: 2026-02-19 19:53:34.152 186666 WARNING neutronclient.v2_0.client [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:53:34 compute-0 nova_compute[186662]: 2026-02-19 19:53:34.152 186666 WARNING neutronclient.v2_0.client [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:53:34 compute-0 nova_compute[186662]: 2026-02-19 19:53:34.664 186666 INFO nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 19 19:53:34 compute-0 nova_compute[186662]: 2026-02-19 19:53:34.929 186666 DEBUG nova.network.neutron [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Successfully created port: b8e7eb67-d798-4e84-b656-41c8f5d55d83 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Feb 19 19:53:35 compute-0 nova_compute[186662]: 2026-02-19 19:53:35.178 186666 DEBUG nova.compute.manager [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Feb 19 19:53:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:35.219 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:53:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:35.220 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:53:35 compute-0 nova_compute[186662]: 2026-02-19 19:53:35.221 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:35 compute-0 nova_compute[186662]: 2026-02-19 19:53:35.552 186666 DEBUG nova.network.neutron [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Successfully updated port: b8e7eb67-d798-4e84-b656-41c8f5d55d83 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Feb 19 19:53:35 compute-0 nova_compute[186662]: 2026-02-19 19:53:35.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:53:35 compute-0 nova_compute[186662]: 2026-02-19 19:53:35.604 186666 DEBUG nova.compute.manager [req-cd3db4a6-01fb-4cd2-8261-d7908f456c42 req-8d0192b8-753f-41fc-815a-d972bb830ea6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Received event network-changed-b8e7eb67-d798-4e84-b656-41c8f5d55d83 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:53:35 compute-0 nova_compute[186662]: 2026-02-19 19:53:35.605 186666 DEBUG nova.compute.manager [req-cd3db4a6-01fb-4cd2-8261-d7908f456c42 req-8d0192b8-753f-41fc-815a-d972bb830ea6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Refreshing instance network info cache due to event network-changed-b8e7eb67-d798-4e84-b656-41c8f5d55d83. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Feb 19 19:53:35 compute-0 nova_compute[186662]: 2026-02-19 19:53:35.606 186666 DEBUG oslo_concurrency.lockutils [req-cd3db4a6-01fb-4cd2-8261-d7908f456c42 req-8d0192b8-753f-41fc-815a-d972bb830ea6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-0b6d0709-7667-4eda-a3b1-ba3adf926203" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:53:35 compute-0 nova_compute[186662]: 2026-02-19 19:53:35.607 186666 DEBUG oslo_concurrency.lockutils [req-cd3db4a6-01fb-4cd2-8261-d7908f456c42 req-8d0192b8-753f-41fc-815a-d972bb830ea6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-0b6d0709-7667-4eda-a3b1-ba3adf926203" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:53:35 compute-0 nova_compute[186662]: 2026-02-19 19:53:35.607 186666 DEBUG nova.network.neutron [req-cd3db4a6-01fb-4cd2-8261-d7908f456c42 req-8d0192b8-753f-41fc-815a-d972bb830ea6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Refreshing network info cache for port b8e7eb67-d798-4e84-b656-41c8f5d55d83 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.061 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "refresh_cache-0b6d0709-7667-4eda-a3b1-ba3adf926203" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.113 186666 WARNING neutronclient.v2_0.client [req-cd3db4a6-01fb-4cd2-8261-d7908f456c42 req-8d0192b8-753f-41fc-815a-d972bb830ea6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.196 186666 DEBUG nova.compute.manager [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.198 186666 DEBUG nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.198 186666 INFO nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Creating image(s)
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.199 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "/var/lib/nova/instances/0b6d0709-7667-4eda-a3b1-ba3adf926203/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.199 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "/var/lib/nova/instances/0b6d0709-7667-4eda-a3b1-ba3adf926203/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.200 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "/var/lib/nova/instances/0b6d0709-7667-4eda-a3b1-ba3adf926203/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.200 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.203 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.205 186666 DEBUG oslo_concurrency.processutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.259 186666 DEBUG oslo_concurrency.processutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.260 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.260 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.261 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.264 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.264 186666 DEBUG oslo_concurrency.processutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.316 186666 DEBUG oslo_concurrency.processutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.318 186666 DEBUG oslo_concurrency.processutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/0b6d0709-7667-4eda-a3b1-ba3adf926203/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.348 186666 DEBUG oslo_concurrency.processutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/0b6d0709-7667-4eda-a3b1-ba3adf926203/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.349 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.089s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.350 186666 DEBUG oslo_concurrency.processutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.394 186666 DEBUG oslo_concurrency.processutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.395 186666 DEBUG nova.virt.disk.api [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Checking if we can resize image /var/lib/nova/instances/0b6d0709-7667-4eda-a3b1-ba3adf926203/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.395 186666 DEBUG oslo_concurrency.processutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b6d0709-7667-4eda-a3b1-ba3adf926203/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.440 186666 DEBUG oslo_concurrency.processutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b6d0709-7667-4eda-a3b1-ba3adf926203/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.441 186666 DEBUG nova.virt.disk.api [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Cannot resize image /var/lib/nova/instances/0b6d0709-7667-4eda-a3b1-ba3adf926203/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.442 186666 DEBUG nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.442 186666 DEBUG nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Ensure instance console log exists: /var/lib/nova/instances/0b6d0709-7667-4eda-a3b1-ba3adf926203/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.442 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.443 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.443 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:53:36 compute-0 nova_compute[186662]: 2026-02-19 19:53:36.636 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:37 compute-0 nova_compute[186662]: 2026-02-19 19:53:37.340 186666 DEBUG nova.network.neutron [req-cd3db4a6-01fb-4cd2-8261-d7908f456c42 req-8d0192b8-753f-41fc-815a-d972bb830ea6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:53:37 compute-0 nova_compute[186662]: 2026-02-19 19:53:37.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:53:38 compute-0 nova_compute[186662]: 2026-02-19 19:53:38.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:53:38 compute-0 nova_compute[186662]: 2026-02-19 19:53:38.577 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:53:38 compute-0 nova_compute[186662]: 2026-02-19 19:53:38.939 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:39 compute-0 nova_compute[186662]: 2026-02-19 19:53:39.317 186666 DEBUG nova.network.neutron [req-cd3db4a6-01fb-4cd2-8261-d7908f456c42 req-8d0192b8-753f-41fc-815a-d972bb830ea6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:53:39 compute-0 nova_compute[186662]: 2026-02-19 19:53:39.572 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:53:39 compute-0 nova_compute[186662]: 2026-02-19 19:53:39.823 186666 DEBUG oslo_concurrency.lockutils [req-cd3db4a6-01fb-4cd2-8261-d7908f456c42 req-8d0192b8-753f-41fc-815a-d972bb830ea6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-0b6d0709-7667-4eda-a3b1-ba3adf926203" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:53:39 compute-0 nova_compute[186662]: 2026-02-19 19:53:39.824 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquired lock "refresh_cache-0b6d0709-7667-4eda-a3b1-ba3adf926203" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:53:39 compute-0 nova_compute[186662]: 2026-02-19 19:53:39.825 186666 DEBUG nova.network.neutron [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:53:40 compute-0 nova_compute[186662]: 2026-02-19 19:53:40.082 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:53:40 compute-0 nova_compute[186662]: 2026-02-19 19:53:40.082 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:53:40 compute-0 nova_compute[186662]: 2026-02-19 19:53:40.423 186666 DEBUG nova.network.neutron [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:53:40 compute-0 nova_compute[186662]: 2026-02-19 19:53:40.593 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:53:40 compute-0 nova_compute[186662]: 2026-02-19 19:53:40.594 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:53:40 compute-0 nova_compute[186662]: 2026-02-19 19:53:40.594 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:53:40 compute-0 nova_compute[186662]: 2026-02-19 19:53:40.594 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:53:40 compute-0 nova_compute[186662]: 2026-02-19 19:53:40.708 186666 WARNING neutronclient.v2_0.client [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:53:40 compute-0 nova_compute[186662]: 2026-02-19 19:53:40.749 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:53:40 compute-0 nova_compute[186662]: 2026-02-19 19:53:40.750 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:53:40 compute-0 nova_compute[186662]: 2026-02-19 19:53:40.761 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.011s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:53:40 compute-0 nova_compute[186662]: 2026-02-19 19:53:40.762 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5825MB free_disk=72.96746444702148GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:53:40 compute-0 nova_compute[186662]: 2026-02-19 19:53:40.762 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:53:40 compute-0 nova_compute[186662]: 2026-02-19 19:53:40.762 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:53:40 compute-0 nova_compute[186662]: 2026-02-19 19:53:40.861 186666 DEBUG nova.network.neutron [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Updating instance_info_cache with network_info: [{"id": "b8e7eb67-d798-4e84-b656-41c8f5d55d83", "address": "fa:16:3e:33:2a:86", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8e7eb67-d7", "ovs_interfaceid": "b8e7eb67-d798-4e84-b656-41c8f5d55d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.367 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Releasing lock "refresh_cache-0b6d0709-7667-4eda-a3b1-ba3adf926203" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.367 186666 DEBUG nova.compute.manager [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Instance network_info: |[{"id": "b8e7eb67-d798-4e84-b656-41c8f5d55d83", "address": "fa:16:3e:33:2a:86", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8e7eb67-d7", "ovs_interfaceid": "b8e7eb67-d798-4e84-b656-41c8f5d55d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.369 186666 DEBUG nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Start _get_guest_xml network_info=[{"id": "b8e7eb67-d798-4e84-b656-41c8f5d55d83", "address": "fa:16:3e:33:2a:86", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8e7eb67-d7", "ovs_interfaceid": "b8e7eb67-d798-4e84-b656-41c8f5d55d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'image_id': 'b8007ea6-afa7-4c5a-abc0-d9d7338ce087'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.372 186666 WARNING nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.373 186666 DEBUG nova.virt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategy-server-206860921', uuid='0b6d0709-7667-4eda-a3b1-ba3adf926203'), owner=OwnerMeta(userid='6f190c8d209a43b19d4cba5936ab90e0', username='tempest-TestExecuteZoneMigrationStrategy-81034023-project-admin', projectid='dafcf6090521473590cdb432a889739e', projectname='tempest-TestExecuteZoneMigrationStrategy-81034023'), image=ImageMeta(id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "b8e7eb67-d798-4e84-b656-41c8f5d55d83", "address": "fa:16:3e:33:2a:86", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8e7eb67-d7", "ovs_interfaceid": 
"b8e7eb67-d798-4e84-b656-41c8f5d55d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1771530821.37355) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.379 186666 DEBUG nova.virt.libvirt.host [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.379 186666 DEBUG nova.virt.libvirt.host [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.384 186666 DEBUG nova.virt.libvirt.host [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.385 186666 DEBUG nova.virt.libvirt.host [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.386 186666 DEBUG nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.386 186666 DEBUG nova.virt.hardware [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-19T19:19:33Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.386 186666 DEBUG nova.virt.hardware [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.387 186666 DEBUG nova.virt.hardware [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.387 186666 DEBUG nova.virt.hardware [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.387 186666 DEBUG nova.virt.hardware [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.387 186666 DEBUG nova.virt.hardware [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.387 186666 DEBUG nova.virt.hardware [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.388 186666 DEBUG nova.virt.hardware [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.388 186666 DEBUG nova.virt.hardware [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.388 186666 DEBUG nova.virt.hardware [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.388 186666 DEBUG nova.virt.hardware [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.391 186666 DEBUG nova.virt.libvirt.vif [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:53:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-206860921',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-206860921',id=31,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dafcf6090521473590cdb432a889739e',ramdisk_id='',reservation_id='r-1rrmxr4p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-81034023',owner_user_name='tempest-TestExecute
ZoneMigrationStrategy-81034023-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:53:35Z,user_data=None,user_id='6f190c8d209a43b19d4cba5936ab90e0',uuid=0b6d0709-7667-4eda-a3b1-ba3adf926203,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b8e7eb67-d798-4e84-b656-41c8f5d55d83", "address": "fa:16:3e:33:2a:86", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8e7eb67-d7", "ovs_interfaceid": "b8e7eb67-d798-4e84-b656-41c8f5d55d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.392 186666 DEBUG nova.network.os_vif_util [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Converting VIF {"id": "b8e7eb67-d798-4e84-b656-41c8f5d55d83", "address": "fa:16:3e:33:2a:86", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8e7eb67-d7", "ovs_interfaceid": "b8e7eb67-d798-4e84-b656-41c8f5d55d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.392 186666 DEBUG nova.network.os_vif_util [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:2a:86,bridge_name='br-int',has_traffic_filtering=True,id=b8e7eb67-d798-4e84-b656-41c8f5d55d83,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8e7eb67-d7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.393 186666 DEBUG nova.objects.instance [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lazy-loading 'pci_devices' on Instance uuid 0b6d0709-7667-4eda-a3b1-ba3adf926203 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.641 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.808 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance 0b6d0709-7667-4eda-a3b1-ba3adf926203 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.809 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.809 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:53:40 up  1:24,  0 user,  load average: 0.06, 0.10, 0.18\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_dafcf6090521473590cdb432a889739e': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.838 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.900 186666 DEBUG nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] End _get_guest_xml xml=<domain type="kvm">
Feb 19 19:53:41 compute-0 nova_compute[186662]:   <uuid>0b6d0709-7667-4eda-a3b1-ba3adf926203</uuid>
Feb 19 19:53:41 compute-0 nova_compute[186662]:   <name>instance-0000001f</name>
Feb 19 19:53:41 compute-0 nova_compute[186662]:   <memory>131072</memory>
Feb 19 19:53:41 compute-0 nova_compute[186662]:   <vcpu>1</vcpu>
Feb 19 19:53:41 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-206860921</nova:name>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:53:41</nova:creationTime>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:53:41 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:53:41 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:53:41 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:53:41 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:53:41 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:53:41 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:53:41 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:53:41 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:53:41 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:53:41 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:53:41 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:53:41 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:53:41 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:53:41 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:53:41 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:53:41 compute-0 nova_compute[186662]:         <nova:user uuid="6f190c8d209a43b19d4cba5936ab90e0">tempest-TestExecuteZoneMigrationStrategy-81034023-project-admin</nova:user>
Feb 19 19:53:41 compute-0 nova_compute[186662]:         <nova:project uuid="dafcf6090521473590cdb432a889739e">tempest-TestExecuteZoneMigrationStrategy-81034023</nova:project>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:53:41 compute-0 nova_compute[186662]:         <nova:port uuid="b8e7eb67-d798-4e84-b656-41c8f5d55d83">
Feb 19 19:53:41 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:53:41 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:53:41 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <system>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <entry name="serial">0b6d0709-7667-4eda-a3b1-ba3adf926203</entry>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <entry name="uuid">0b6d0709-7667-4eda-a3b1-ba3adf926203</entry>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     </system>
Feb 19 19:53:41 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:53:41 compute-0 nova_compute[186662]:   <os>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:   </os>
Feb 19 19:53:41 compute-0 nova_compute[186662]:   <features>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <vmcoreinfo/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:   </features>
Feb 19 19:53:41 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:53:41 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact">
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <model>Nehalem</model>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <topology sockets="1" cores="1" threads="1"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:53:41 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/0b6d0709-7667-4eda-a3b1-ba3adf926203/disk"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/0b6d0709-7667-4eda-a3b1-ba3adf926203/disk.config"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <interface type="ethernet">
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <mac address="fa:16:3e:33:2a:86"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <mtu size="1442"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <target dev="tapb8e7eb67-d7"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <serial type="pty">
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/0b6d0709-7667-4eda-a3b1-ba3adf926203/console.log" append="off"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <video>
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     </video>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <controller type="usb" index="0"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:53:41 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:53:41 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:53:41 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:53:41 compute-0 nova_compute[186662]: </domain>
Feb 19 19:53:41 compute-0 nova_compute[186662]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.901 186666 DEBUG nova.compute.manager [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Preparing to wait for external event network-vif-plugged-b8e7eb67-d798-4e84-b656-41c8f5d55d83 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.901 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "0b6d0709-7667-4eda-a3b1-ba3adf926203-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.901 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "0b6d0709-7667-4eda-a3b1-ba3adf926203-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.901 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "0b6d0709-7667-4eda-a3b1-ba3adf926203-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.902 186666 DEBUG nova.virt.libvirt.vif [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:53:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-206860921',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-206860921',id=31,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dafcf6090521473590cdb432a889739e',ramdisk_id='',reservation_id='r-1rrmxr4p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-81034023',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-81034023-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:53:35Z,user_data=None,user_id='6f190c8d209a43b19d4cba5936ab90e0',uuid=0b6d0709-7667-4eda-a3b1-ba3adf926203,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b8e7eb67-d798-4e84-b656-41c8f5d55d83", "address": "fa:16:3e:33:2a:86", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8e7eb67-d7", "ovs_interfaceid": "b8e7eb67-d798-4e84-b656-41c8f5d55d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.902 186666 DEBUG nova.network.os_vif_util [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Converting VIF {"id": "b8e7eb67-d798-4e84-b656-41c8f5d55d83", "address": "fa:16:3e:33:2a:86", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8e7eb67-d7", "ovs_interfaceid": "b8e7eb67-d798-4e84-b656-41c8f5d55d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.903 186666 DEBUG nova.network.os_vif_util [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:2a:86,bridge_name='br-int',has_traffic_filtering=True,id=b8e7eb67-d798-4e84-b656-41c8f5d55d83,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8e7eb67-d7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.903 186666 DEBUG os_vif [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:2a:86,bridge_name='br-int',has_traffic_filtering=True,id=b8e7eb67-d798-4e84-b656-41c8f5d55d83,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8e7eb67-d7') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.904 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.904 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.904 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.905 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.905 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '304f9b48-f848-570c-be06-5d4433a66b5b', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.906 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.907 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.909 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.909 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8e7eb67-d7, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.909 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapb8e7eb67-d7, col_values=(('qos', UUID('1d21dc13-0edc-4073-a6bd-1d8bc06263ec')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.910 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapb8e7eb67-d7, col_values=(('external_ids', {'iface-id': 'b8e7eb67-d798-4e84-b656-41c8f5d55d83', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:33:2a:86', 'vm-uuid': '0b6d0709-7667-4eda-a3b1-ba3adf926203'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.911 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.911 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:41 compute-0 NetworkManager[56519]: <info>  [1771530821.9126] manager: (tapb8e7eb67-d7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.913 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.916 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:41 compute-0 nova_compute[186662]: 2026-02-19 19:53:41.917 186666 INFO os_vif [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:2a:86,bridge_name='br-int',has_traffic_filtering=True,id=b8e7eb67-d798-4e84-b656-41c8f5d55d83,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8e7eb67-d7')
Feb 19 19:53:42 compute-0 nova_compute[186662]: 2026-02-19 19:53:42.344 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:53:42 compute-0 nova_compute[186662]: 2026-02-19 19:53:42.861 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:53:42 compute-0 nova_compute[186662]: 2026-02-19 19:53:42.861 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.099s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:53:43 compute-0 nova_compute[186662]: 2026-02-19 19:53:43.457 186666 DEBUG nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:53:43 compute-0 nova_compute[186662]: 2026-02-19 19:53:43.457 186666 DEBUG nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:53:43 compute-0 nova_compute[186662]: 2026-02-19 19:53:43.458 186666 DEBUG nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] No VIF found with MAC fa:16:3e:33:2a:86, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Feb 19 19:53:43 compute-0 nova_compute[186662]: 2026-02-19 19:53:43.458 186666 INFO nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Using config drive
Feb 19 19:53:43 compute-0 nova_compute[186662]: 2026-02-19 19:53:43.967 186666 WARNING neutronclient.v2_0.client [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:53:44 compute-0 podman[218300]: 2026-02-19 19:53:44.311744137 +0000 UTC m=+0.085458247 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 19 19:53:44 compute-0 nova_compute[186662]: 2026-02-19 19:53:44.398 186666 INFO nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Creating config drive at /var/lib/nova/instances/0b6d0709-7667-4eda-a3b1-ba3adf926203/disk.config
Feb 19 19:53:44 compute-0 nova_compute[186662]: 2026-02-19 19:53:44.403 186666 DEBUG oslo_concurrency.processutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0b6d0709-7667-4eda-a3b1-ba3adf926203/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpjq51u6n8 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:53:44 compute-0 nova_compute[186662]: 2026-02-19 19:53:44.524 186666 DEBUG oslo_concurrency.processutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0b6d0709-7667-4eda-a3b1-ba3adf926203/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpjq51u6n8" returned: 0 in 0.122s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:53:44 compute-0 kernel: tapb8e7eb67-d7: entered promiscuous mode
Feb 19 19:53:44 compute-0 NetworkManager[56519]: <info>  [1771530824.5895] manager: (tapb8e7eb67-d7): new Tun device (/org/freedesktop/NetworkManager/Devices/89)
Feb 19 19:53:44 compute-0 ovn_controller[96653]: 2026-02-19T19:53:44Z|00233|binding|INFO|Claiming lport b8e7eb67-d798-4e84-b656-41c8f5d55d83 for this chassis.
Feb 19 19:53:44 compute-0 ovn_controller[96653]: 2026-02-19T19:53:44Z|00234|binding|INFO|b8e7eb67-d798-4e84-b656-41c8f5d55d83: Claiming fa:16:3e:33:2a:86 10.100.0.14
Feb 19 19:53:44 compute-0 nova_compute[186662]: 2026-02-19 19:53:44.591 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:44 compute-0 nova_compute[186662]: 2026-02-19 19:53:44.595 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:44 compute-0 nova_compute[186662]: 2026-02-19 19:53:44.615 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.619 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:2a:86 10.100.0.14'], port_security=['fa:16:3e:33:2a:86 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '0b6d0709-7667-4eda-a3b1-ba3adf926203', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dafcf6090521473590cdb432a889739e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8f4a8406-fbc3-4742-80fc-e9181ef138d4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ddef6f8-604e-4b41-9f83-1182f76298b6, chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=b8e7eb67-d798-4e84-b656-41c8f5d55d83) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.619 105986 INFO neutron.agent.ovn.metadata.agent [-] Port b8e7eb67-d798-4e84-b656-41c8f5d55d83 in datapath 2cb93231-0e3e-4efd-8b0c-4366500dbd16 bound to our chassis
Feb 19 19:53:44 compute-0 systemd-udevd[218337]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.620 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2cb93231-0e3e-4efd-8b0c-4366500dbd16
Feb 19 19:53:44 compute-0 ovn_controller[96653]: 2026-02-19T19:53:44Z|00235|binding|INFO|Setting lport b8e7eb67-d798-4e84-b656-41c8f5d55d83 ovn-installed in OVS
Feb 19 19:53:44 compute-0 ovn_controller[96653]: 2026-02-19T19:53:44Z|00236|binding|INFO|Setting lport b8e7eb67-d798-4e84-b656-41c8f5d55d83 up in Southbound
Feb 19 19:53:44 compute-0 nova_compute[186662]: 2026-02-19 19:53:44.623 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.629 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[b598af1b-841c-4061-a324-b14b51aa2f66]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.629 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2cb93231-01 in ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.631 207540 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2cb93231-00 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Feb 19 19:53:44 compute-0 NetworkManager[56519]: <info>  [1771530824.6319] device (tapb8e7eb67-d7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.631 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[fb65bdd1-4ad4-49b8-9fa8-8bfe7ee3a4c2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.631 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[448f84e7-2cd6-4c3f-a43a-1292580b8feb]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:53:44 compute-0 NetworkManager[56519]: <info>  [1771530824.6325] device (tapb8e7eb67-d7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:53:44 compute-0 systemd-machined[156014]: New machine qemu-22-instance-0000001f.
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.640 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[d6d2277a-dc12-4b98-931f-12b47e9cf1c0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:53:44 compute-0 systemd[1]: Started Virtual Machine qemu-22-instance-0000001f.
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.656 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ab2665ce-5888-4194-a9c3-6aeddb876c32]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.680 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[c98f4f98-86b6-4d8a-b7d6-ebc38a39d000]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:53:44 compute-0 systemd-udevd[218341]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.684 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[957a699b-3d8d-4af9-ab7e-6accd7817b07]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:53:44 compute-0 NetworkManager[56519]: <info>  [1771530824.6865] manager: (tap2cb93231-00): new Veth device (/org/freedesktop/NetworkManager/Devices/90)
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.706 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[8f45a2e7-722c-46a2-b071-c474291bfa57]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.708 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[0bfc1bda-6be9-4eaf-acd5-2889e436c6dd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:53:44 compute-0 NetworkManager[56519]: <info>  [1771530824.7247] device (tap2cb93231-00): carrier: link connected
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.733 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[54756358-4cd4-4e58-a067-0aa189b1fd7c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.747 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[6cfab89c-cc4e-475b-b175-0335117a0f77]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2cb93231-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:50:71'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 509428, 'reachable_time': 15664, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218371, 'error': None, 'target': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.763 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[e1894704-d3dc-41fd-b150-17226b333aaf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feff:5071'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 509428, 'tstamp': 509428}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218372, 'error': None, 'target': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.777 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[71657de4-da24-4721-836e-72a127d2f3dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2cb93231-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:50:71'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 509428, 'reachable_time': 15664, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218373, 'error': None, 'target': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.806 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[50616183-68ba-49c5-8c2c-14bf68ccd575]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:53:44 compute-0 nova_compute[186662]: 2026-02-19 19:53:44.834 186666 DEBUG nova.compute.manager [req-bb15c3a8-64ed-4a9d-bc74-0697bfbc2820 req-7ae4b1c4-0d9e-486f-ba90-e0ba737e6f90 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Received event network-vif-plugged-b8e7eb67-d798-4e84-b656-41c8f5d55d83 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:53:44 compute-0 nova_compute[186662]: 2026-02-19 19:53:44.834 186666 DEBUG oslo_concurrency.lockutils [req-bb15c3a8-64ed-4a9d-bc74-0697bfbc2820 req-7ae4b1c4-0d9e-486f-ba90-e0ba737e6f90 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "0b6d0709-7667-4eda-a3b1-ba3adf926203-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:53:44 compute-0 nova_compute[186662]: 2026-02-19 19:53:44.835 186666 DEBUG oslo_concurrency.lockutils [req-bb15c3a8-64ed-4a9d-bc74-0697bfbc2820 req-7ae4b1c4-0d9e-486f-ba90-e0ba737e6f90 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "0b6d0709-7667-4eda-a3b1-ba3adf926203-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:53:44 compute-0 nova_compute[186662]: 2026-02-19 19:53:44.835 186666 DEBUG oslo_concurrency.lockutils [req-bb15c3a8-64ed-4a9d-bc74-0697bfbc2820 req-7ae4b1c4-0d9e-486f-ba90-e0ba737e6f90 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "0b6d0709-7667-4eda-a3b1-ba3adf926203-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:53:44 compute-0 nova_compute[186662]: 2026-02-19 19:53:44.835 186666 DEBUG nova.compute.manager [req-bb15c3a8-64ed-4a9d-bc74-0697bfbc2820 req-7ae4b1c4-0d9e-486f-ba90-e0ba737e6f90 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Processing event network-vif-plugged-b8e7eb67-d798-4e84-b656-41c8f5d55d83 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.858 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[52f9d9d8-e5f3-449f-8b84-586d4a38e5d2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.859 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2cb93231-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.860 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.860 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2cb93231-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:53:44 compute-0 kernel: tap2cb93231-00: entered promiscuous mode
Feb 19 19:53:44 compute-0 NetworkManager[56519]: <info>  [1771530824.8627] manager: (tap2cb93231-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Feb 19 19:53:44 compute-0 nova_compute[186662]: 2026-02-19 19:53:44.861 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:44 compute-0 nova_compute[186662]: 2026-02-19 19:53:44.864 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.866 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2cb93231-00, col_values=(('external_ids', {'iface-id': '4d5db50f-b83e-4965-bddf-4fd62a569e63'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:53:44 compute-0 nova_compute[186662]: 2026-02-19 19:53:44.867 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:44 compute-0 ovn_controller[96653]: 2026-02-19T19:53:44Z|00237|binding|INFO|Releasing lport 4d5db50f-b83e-4965-bddf-4fd62a569e63 from this chassis (sb_readonly=0)
Feb 19 19:53:44 compute-0 nova_compute[186662]: 2026-02-19 19:53:44.868 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.870 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[5653f60a-d8b4-4c38-9c1b-4a6ad17e05d8]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.871 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.871 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.871 105986 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 2cb93231-0e3e-4efd-8b0c-4366500dbd16 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.871 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.872 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[f94494fc-2c09-44ec-b04c-2edb1ed7d07a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.872 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:53:44 compute-0 nova_compute[186662]: 2026-02-19 19:53:44.872 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.873 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[6bcc55a7-f28d-4a72-99dd-b862ec0cd925]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.874 105986 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: global
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     log         /dev/log local0 debug
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     log-tag     haproxy-metadata-proxy-2cb93231-0e3e-4efd-8b0c-4366500dbd16
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     user        root
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     group       root
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     maxconn     1024
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     pidfile     /var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     daemon
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: defaults
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     log global
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     mode http
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     option httplog
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     option dontlognull
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     option http-server-close
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     option forwardfor
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     retries                 3
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     timeout http-request    30s
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     timeout connect         30s
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     timeout client          32s
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     timeout server          32s
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     timeout http-keep-alive 30s
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: listen listener
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     bind 169.254.169.254:80
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     server metadata /var/lib/neutron/metadata_proxy
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:     http-request add-header X-OVN-Network-ID 2cb93231-0e3e-4efd-8b0c-4366500dbd16
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Feb 19 19:53:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:44.875 105986 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'env', 'PROCESS_TAG=haproxy-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2cb93231-0e3e-4efd-8b0c-4366500dbd16.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Feb 19 19:53:45 compute-0 nova_compute[186662]: 2026-02-19 19:53:45.191 186666 DEBUG nova.compute.manager [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Feb 19 19:53:45 compute-0 nova_compute[186662]: 2026-02-19 19:53:45.194 186666 DEBUG nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Feb 19 19:53:45 compute-0 nova_compute[186662]: 2026-02-19 19:53:45.198 186666 INFO nova.virt.libvirt.driver [-] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Instance spawned successfully.
Feb 19 19:53:45 compute-0 nova_compute[186662]: 2026-02-19 19:53:45.198 186666 DEBUG nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Feb 19 19:53:45 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:53:45.222 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:53:45 compute-0 podman[218412]: 2026-02-19 19:53:45.237625255 +0000 UTC m=+0.050966829 container create d0534c9681dd8baf42f07c96771b729f8a0b621106655aadbd99d5b48a53a2b8 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 19 19:53:45 compute-0 systemd[1]: Started libpod-conmon-d0534c9681dd8baf42f07c96771b729f8a0b621106655aadbd99d5b48a53a2b8.scope.
Feb 19 19:53:45 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:53:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/374a82fac12f8cc289c722a142405a2d534ea82f2d82c09821b97e39eeddd59f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 19 19:53:45 compute-0 podman[218412]: 2026-02-19 19:53:45.208819766 +0000 UTC m=+0.022161360 image pull dfc4ee2c621ea595c16a7cd7a8233293e80df6b99346b1ce1a7ed22a35aace27 38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Feb 19 19:53:45 compute-0 podman[218412]: 2026-02-19 19:53:45.307114204 +0000 UTC m=+0.120455788 container init d0534c9681dd8baf42f07c96771b729f8a0b621106655aadbd99d5b48a53a2b8 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 19 19:53:45 compute-0 podman[218412]: 2026-02-19 19:53:45.311316446 +0000 UTC m=+0.124658020 container start d0534c9681dd8baf42f07c96771b729f8a0b621106655aadbd99d5b48a53a2b8 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:53:45 compute-0 neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16[218427]: [NOTICE]   (218431) : New worker (218433) forked
Feb 19 19:53:45 compute-0 neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16[218427]: [NOTICE]   (218431) : Loading success.
Feb 19 19:53:45 compute-0 nova_compute[186662]: 2026-02-19 19:53:45.355 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:53:45 compute-0 nova_compute[186662]: 2026-02-19 19:53:45.710 186666 DEBUG nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:53:45 compute-0 nova_compute[186662]: 2026-02-19 19:53:45.711 186666 DEBUG nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:53:45 compute-0 nova_compute[186662]: 2026-02-19 19:53:45.711 186666 DEBUG nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:53:45 compute-0 nova_compute[186662]: 2026-02-19 19:53:45.711 186666 DEBUG nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:53:45 compute-0 nova_compute[186662]: 2026-02-19 19:53:45.712 186666 DEBUG nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:53:45 compute-0 nova_compute[186662]: 2026-02-19 19:53:45.712 186666 DEBUG nova.virt.libvirt.driver [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:53:46 compute-0 nova_compute[186662]: 2026-02-19 19:53:46.219 186666 INFO nova.compute.manager [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Took 10.02 seconds to spawn the instance on the hypervisor.
Feb 19 19:53:46 compute-0 nova_compute[186662]: 2026-02-19 19:53:46.220 186666 DEBUG nova.compute.manager [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:53:46 compute-0 nova_compute[186662]: 2026-02-19 19:53:46.643 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:46 compute-0 nova_compute[186662]: 2026-02-19 19:53:46.748 186666 INFO nova.compute.manager [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Took 15.26 seconds to build instance.
Feb 19 19:53:46 compute-0 nova_compute[186662]: 2026-02-19 19:53:46.878 186666 DEBUG nova.compute.manager [req-9be49b6b-451f-4441-b600-df9f82766512 req-a51893cc-763b-4ce9-bc7d-11cb1ca35e3c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Received event network-vif-plugged-b8e7eb67-d798-4e84-b656-41c8f5d55d83 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:53:46 compute-0 nova_compute[186662]: 2026-02-19 19:53:46.878 186666 DEBUG oslo_concurrency.lockutils [req-9be49b6b-451f-4441-b600-df9f82766512 req-a51893cc-763b-4ce9-bc7d-11cb1ca35e3c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "0b6d0709-7667-4eda-a3b1-ba3adf926203-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:53:46 compute-0 nova_compute[186662]: 2026-02-19 19:53:46.879 186666 DEBUG oslo_concurrency.lockutils [req-9be49b6b-451f-4441-b600-df9f82766512 req-a51893cc-763b-4ce9-bc7d-11cb1ca35e3c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "0b6d0709-7667-4eda-a3b1-ba3adf926203-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:53:46 compute-0 nova_compute[186662]: 2026-02-19 19:53:46.879 186666 DEBUG oslo_concurrency.lockutils [req-9be49b6b-451f-4441-b600-df9f82766512 req-a51893cc-763b-4ce9-bc7d-11cb1ca35e3c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "0b6d0709-7667-4eda-a3b1-ba3adf926203-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:53:46 compute-0 nova_compute[186662]: 2026-02-19 19:53:46.879 186666 DEBUG nova.compute.manager [req-9be49b6b-451f-4441-b600-df9f82766512 req-a51893cc-763b-4ce9-bc7d-11cb1ca35e3c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] No waiting events found dispatching network-vif-plugged-b8e7eb67-d798-4e84-b656-41c8f5d55d83 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:53:46 compute-0 nova_compute[186662]: 2026-02-19 19:53:46.879 186666 WARNING nova.compute.manager [req-9be49b6b-451f-4441-b600-df9f82766512 req-a51893cc-763b-4ce9-bc7d-11cb1ca35e3c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Received unexpected event network-vif-plugged-b8e7eb67-d798-4e84-b656-41c8f5d55d83 for instance with vm_state active and task_state None.
Feb 19 19:53:46 compute-0 nova_compute[186662]: 2026-02-19 19:53:46.912 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:47 compute-0 nova_compute[186662]: 2026-02-19 19:53:47.253 186666 DEBUG oslo_concurrency.lockutils [None req-f962e4f2-df51-4ba9-8f03-134fbc628438 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "0b6d0709-7667-4eda-a3b1-ba3adf926203" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.859s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:53:49 compute-0 podman[218442]: 2026-02-19 19:53:49.277439539 +0000 UTC m=+0.051554394 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.buildah.version=1.33.7, release=1770267347, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9 Minimal.)
Feb 19 19:53:51 compute-0 podman[218462]: 2026-02-19 19:53:51.283376651 +0000 UTC m=+0.059106837 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:53:51 compute-0 nova_compute[186662]: 2026-02-19 19:53:51.645 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:51 compute-0 nova_compute[186662]: 2026-02-19 19:53:51.914 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:55 compute-0 podman[218488]: 2026-02-19 19:53:55.259441935 +0000 UTC m=+0.040812742 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 19 19:53:56 compute-0 ovn_controller[96653]: 2026-02-19T19:53:56Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:33:2a:86 10.100.0.14
Feb 19 19:53:56 compute-0 ovn_controller[96653]: 2026-02-19T19:53:56Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:33:2a:86 10.100.0.14
Feb 19 19:53:56 compute-0 nova_compute[186662]: 2026-02-19 19:53:56.646 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:56 compute-0 nova_compute[186662]: 2026-02-19 19:53:56.916 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:53:59 compute-0 podman[196025]: time="2026-02-19T19:53:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:53:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:53:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17221 "" "Go-http-client/1.1"
Feb 19 19:53:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:53:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2660 "" "Go-http-client/1.1"
Feb 19 19:54:01 compute-0 openstack_network_exporter[198916]: ERROR   19:54:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:54:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:54:01 compute-0 openstack_network_exporter[198916]: ERROR   19:54:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:54:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:54:01 compute-0 nova_compute[186662]: 2026-02-19 19:54:01.649 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:01 compute-0 nova_compute[186662]: 2026-02-19 19:54:01.918 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:02 compute-0 nova_compute[186662]: 2026-02-19 19:54:02.016 186666 DEBUG nova.virt.libvirt.driver [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Creating tmpfile /var/lib/nova/instances/tmpq20u8a7b to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Feb 19 19:54:02 compute-0 nova_compute[186662]: 2026-02-19 19:54:02.017 186666 WARNING neutronclient.v2_0.client [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:54:02 compute-0 nova_compute[186662]: 2026-02-19 19:54:02.027 186666 DEBUG nova.compute.manager [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpq20u8a7b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Feb 19 19:54:04 compute-0 nova_compute[186662]: 2026-02-19 19:54:04.067 186666 WARNING neutronclient.v2_0.client [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:54:06 compute-0 nova_compute[186662]: 2026-02-19 19:54:06.651 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:06 compute-0 nova_compute[186662]: 2026-02-19 19:54:06.941 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:07 compute-0 nova_compute[186662]: 2026-02-19 19:54:07.983 186666 DEBUG nova.compute.manager [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpq20u8a7b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Feb 19 19:54:08 compute-0 nova_compute[186662]: 2026-02-19 19:54:08.994 186666 DEBUG oslo_concurrency.lockutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:54:08 compute-0 nova_compute[186662]: 2026-02-19 19:54:08.995 186666 DEBUG oslo_concurrency.lockutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:54:08 compute-0 nova_compute[186662]: 2026-02-19 19:54:08.995 186666 DEBUG nova.network.neutron [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:54:09 compute-0 nova_compute[186662]: 2026-02-19 19:54:09.501 186666 WARNING neutronclient.v2_0.client [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:54:10 compute-0 nova_compute[186662]: 2026-02-19 19:54:10.643 186666 WARNING neutronclient.v2_0.client [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:54:10 compute-0 nova_compute[186662]: 2026-02-19 19:54:10.811 186666 DEBUG nova.network.neutron [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Updating instance_info_cache with network_info: [{"id": "298f3d97-567c-49be-9314-19ab2839b2b8", "address": "fa:16:3e:2e:74:88", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap298f3d97-56", "ovs_interfaceid": "298f3d97-567c-49be-9314-19ab2839b2b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:54:11 compute-0 nova_compute[186662]: 2026-02-19 19:54:11.316 186666 DEBUG oslo_concurrency.lockutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:54:11 compute-0 nova_compute[186662]: 2026-02-19 19:54:11.331 186666 DEBUG nova.virt.libvirt.driver [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpq20u8a7b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Feb 19 19:54:11 compute-0 nova_compute[186662]: 2026-02-19 19:54:11.332 186666 DEBUG nova.virt.libvirt.driver [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Creating instance directory: /var/lib/nova/instances/eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Feb 19 19:54:11 compute-0 nova_compute[186662]: 2026-02-19 19:54:11.332 186666 DEBUG nova.virt.libvirt.driver [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Creating disk.info with the contents: {'/var/lib/nova/instances/eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d/disk': 'qcow2', '/var/lib/nova/instances/eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Feb 19 19:54:11 compute-0 nova_compute[186662]: 2026-02-19 19:54:11.332 186666 DEBUG nova.virt.libvirt.driver [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Feb 19 19:54:11 compute-0 nova_compute[186662]: 2026-02-19 19:54:11.333 186666 DEBUG nova.objects.instance [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'trusted_certs' on Instance uuid eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:54:11 compute-0 nova_compute[186662]: 2026-02-19 19:54:11.655 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:11 compute-0 nova_compute[186662]: 2026-02-19 19:54:11.840 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:54:11 compute-0 nova_compute[186662]: 2026-02-19 19:54:11.843 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:54:11 compute-0 nova_compute[186662]: 2026-02-19 19:54:11.844 186666 DEBUG oslo_concurrency.processutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:54:11 compute-0 nova_compute[186662]: 2026-02-19 19:54:11.885 186666 DEBUG oslo_concurrency.processutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:54:11 compute-0 nova_compute[186662]: 2026-02-19 19:54:11.886 186666 DEBUG oslo_concurrency.lockutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:54:11 compute-0 nova_compute[186662]: 2026-02-19 19:54:11.886 186666 DEBUG oslo_concurrency.lockutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:54:11 compute-0 nova_compute[186662]: 2026-02-19 19:54:11.886 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:54:11 compute-0 nova_compute[186662]: 2026-02-19 19:54:11.889 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:54:11 compute-0 nova_compute[186662]: 2026-02-19 19:54:11.889 186666 DEBUG oslo_concurrency.processutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:54:11 compute-0 nova_compute[186662]: 2026-02-19 19:54:11.943 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:11 compute-0 nova_compute[186662]: 2026-02-19 19:54:11.972 186666 DEBUG oslo_concurrency.processutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:54:11 compute-0 nova_compute[186662]: 2026-02-19 19:54:11.972 186666 DEBUG oslo_concurrency.processutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.001 186666 DEBUG oslo_concurrency.processutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.002 186666 DEBUG oslo_concurrency.lockutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.003 186666 DEBUG oslo_concurrency.processutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.050 186666 DEBUG oslo_concurrency.processutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.050 186666 DEBUG nova.virt.disk.api [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Checking if we can resize image /var/lib/nova/instances/eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.051 186666 DEBUG oslo_concurrency.processutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.091 186666 DEBUG oslo_concurrency.processutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d/disk --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.092 186666 DEBUG nova.virt.disk.api [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Cannot resize image /var/lib/nova/instances/eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.092 186666 DEBUG nova.objects.instance [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'migration_context' on Instance uuid eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.598 186666 DEBUG nova.objects.base [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Object Instance<eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.598 186666 DEBUG oslo_concurrency.processutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.621 186666 DEBUG oslo_concurrency.processutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d/disk.config 497664" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.622 186666 DEBUG nova.virt.libvirt.driver [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.624 186666 DEBUG nova.virt.libvirt.vif [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-02-19T19:53:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-670100340',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-670100340',id=30,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:53:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='dafcf6090521473590cdb432a889739e',ramdisk_id='',reservation_id='r-gdlb11w1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-81034023',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-81034023-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:53:26Z,user_data=None,user_id='6f190c8d209a43b19d4cba5936ab90e0',uuid=eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "298f3d97-567c-49be-9314-19ab2839b2b8", "address": "fa:16:3e:2e:74:88", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap298f3d97-56", "ovs_interfaceid": "298f3d97-567c-49be-9314-19ab2839b2b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.624 186666 DEBUG nova.network.os_vif_util [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "298f3d97-567c-49be-9314-19ab2839b2b8", "address": "fa:16:3e:2e:74:88", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap298f3d97-56", "ovs_interfaceid": "298f3d97-567c-49be-9314-19ab2839b2b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.625 186666 DEBUG nova.network.os_vif_util [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:74:88,bridge_name='br-int',has_traffic_filtering=True,id=298f3d97-567c-49be-9314-19ab2839b2b8,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap298f3d97-56') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.625 186666 DEBUG os_vif [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:74:88,bridge_name='br-int',has_traffic_filtering=True,id=298f3d97-567c-49be-9314-19ab2839b2b8,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap298f3d97-56') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.626 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.626 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.626 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.627 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.627 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '029f5703-308f-5250-a943-1ac91fb17bab', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.628 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.629 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.633 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.633 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap298f3d97-56, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.634 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap298f3d97-56, col_values=(('qos', UUID('d753be96-a3c1-4d66-add9-b84e403c722d')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.634 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap298f3d97-56, col_values=(('external_ids', {'iface-id': '298f3d97-567c-49be-9314-19ab2839b2b8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2e:74:88', 'vm-uuid': 'eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.635 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:12 compute-0 NetworkManager[56519]: <info>  [1771530852.6357] manager: (tap298f3d97-56): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.637 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.641 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.641 186666 INFO os_vif [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:74:88,bridge_name='br-int',has_traffic_filtering=True,id=298f3d97-567c-49be-9314-19ab2839b2b8,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap298f3d97-56')
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.642 186666 DEBUG nova.virt.libvirt.driver [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.642 186666 DEBUG nova.compute.manager [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpq20u8a7b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Feb 19 19:54:12 compute-0 nova_compute[186662]: 2026-02-19 19:54:12.642 186666 WARNING neutronclient.v2_0.client [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:54:12 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 19 19:54:13 compute-0 nova_compute[186662]: 2026-02-19 19:54:13.377 186666 WARNING neutronclient.v2_0.client [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:54:14 compute-0 nova_compute[186662]: 2026-02-19 19:54:14.537 186666 DEBUG nova.network.neutron [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Port 298f3d97-567c-49be-9314-19ab2839b2b8 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Feb 19 19:54:14 compute-0 nova_compute[186662]: 2026-02-19 19:54:14.553 186666 DEBUG nova.compute.manager [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpq20u8a7b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Feb 19 19:54:14 compute-0 ovn_controller[96653]: 2026-02-19T19:54:14Z|00238|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 19 19:54:15 compute-0 podman[218543]: 2026-02-19 19:54:15.289300884 +0000 UTC m=+0.066185607 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 19 19:54:16 compute-0 nova_compute[186662]: 2026-02-19 19:54:16.658 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:17 compute-0 nova_compute[186662]: 2026-02-19 19:54:17.637 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:17 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 19 19:54:17 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 19 19:54:18 compute-0 kernel: tap298f3d97-56: entered promiscuous mode
Feb 19 19:54:18 compute-0 NetworkManager[56519]: <info>  [1771530858.4283] manager: (tap298f3d97-56): new Tun device (/org/freedesktop/NetworkManager/Devices/93)
Feb 19 19:54:18 compute-0 systemd-udevd[218593]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:54:18 compute-0 nova_compute[186662]: 2026-02-19 19:54:18.473 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:18 compute-0 ovn_controller[96653]: 2026-02-19T19:54:18Z|00239|binding|INFO|Claiming lport 298f3d97-567c-49be-9314-19ab2839b2b8 for this additional chassis.
Feb 19 19:54:18 compute-0 ovn_controller[96653]: 2026-02-19T19:54:18Z|00240|binding|INFO|298f3d97-567c-49be-9314-19ab2839b2b8: Claiming fa:16:3e:2e:74:88 10.100.0.4
Feb 19 19:54:18 compute-0 nova_compute[186662]: 2026-02-19 19:54:18.477 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:18 compute-0 ovn_controller[96653]: 2026-02-19T19:54:18Z|00241|binding|INFO|Setting lport 298f3d97-567c-49be-9314-19ab2839b2b8 ovn-installed in OVS
Feb 19 19:54:18 compute-0 nova_compute[186662]: 2026-02-19 19:54:18.480 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:18 compute-0 NetworkManager[56519]: <info>  [1771530858.4854] device (tap298f3d97-56): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:54:18 compute-0 NetworkManager[56519]: <info>  [1771530858.4865] device (tap298f3d97-56): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:54:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:18.501 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:74:88 10.100.0.4'], port_security=['fa:16:3e:2e:74:88 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dafcf6090521473590cdb432a889739e', 'neutron:revision_number': '10', 'neutron:security_group_ids': '8f4a8406-fbc3-4742-80fc-e9181ef138d4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ddef6f8-604e-4b41-9f83-1182f76298b6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=298f3d97-567c-49be-9314-19ab2839b2b8) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:54:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:18.503 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 298f3d97-567c-49be-9314-19ab2839b2b8 in datapath 2cb93231-0e3e-4efd-8b0c-4366500dbd16 unbound from our chassis
Feb 19 19:54:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:18.504 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2cb93231-0e3e-4efd-8b0c-4366500dbd16
Feb 19 19:54:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:18.518 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[94b41941-47dc-462f-bd80-8a44eacb5725]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:18 compute-0 systemd-machined[156014]: New machine qemu-23-instance-0000001e.
Feb 19 19:54:18 compute-0 systemd[1]: Started Virtual Machine qemu-23-instance-0000001e.
Feb 19 19:54:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:18.543 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[b0fbda03-f5ad-42d9-beb1-8da25ac451d9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:18.545 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[ddc0f523-856d-4b98-9067-ac47f2e64e32]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:18.568 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[ba721bbd-f9ca-45aa-a945-4da6685aa718]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:18.583 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[2ff2109f-d277-44d9-ac38-8996da830dc2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2cb93231-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:50:71'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 509428, 'reachable_time': 15664, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218609, 'error': None, 'target': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:18.597 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[06b5721e-1bbd-4a7c-bb50-f72e2e28d4dc]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2cb93231-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 509438, 'tstamp': 509438}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218611, 'error': None, 'target': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2cb93231-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 509441, 'tstamp': 509441}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218611, 'error': None, 'target': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:18.598 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2cb93231-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:54:18 compute-0 nova_compute[186662]: 2026-02-19 19:54:18.600 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:18 compute-0 nova_compute[186662]: 2026-02-19 19:54:18.601 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:18.601 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2cb93231-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:54:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:18.601 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:54:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:18.602 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2cb93231-00, col_values=(('external_ids', {'iface-id': '4d5db50f-b83e-4965-bddf-4fd62a569e63'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:54:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:18.602 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:54:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:18.604 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[d4d862a7-f84f-49cb-ab28-07e3ea849178]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-2cb93231-0e3e-4efd-8b0c-4366500dbd16\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 2cb93231-0e3e-4efd-8b0c-4366500dbd16\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:20 compute-0 podman[218633]: 2026-02-19 19:54:20.298122383 +0000 UTC m=+0.069983929 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., version=9.7, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Feb 19 19:54:20 compute-0 ovn_controller[96653]: 2026-02-19T19:54:20Z|00242|binding|INFO|Claiming lport 298f3d97-567c-49be-9314-19ab2839b2b8 for this chassis.
Feb 19 19:54:20 compute-0 ovn_controller[96653]: 2026-02-19T19:54:20Z|00243|binding|INFO|298f3d97-567c-49be-9314-19ab2839b2b8: Claiming fa:16:3e:2e:74:88 10.100.0.4
Feb 19 19:54:20 compute-0 ovn_controller[96653]: 2026-02-19T19:54:20Z|00244|binding|INFO|Setting lport 298f3d97-567c-49be-9314-19ab2839b2b8 up in Southbound
Feb 19 19:54:21 compute-0 nova_compute[186662]: 2026-02-19 19:54:21.660 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:22 compute-0 nova_compute[186662]: 2026-02-19 19:54:22.010 186666 INFO nova.compute.manager [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Post operation of migration started
Feb 19 19:54:22 compute-0 nova_compute[186662]: 2026-02-19 19:54:22.011 186666 WARNING neutronclient.v2_0.client [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:54:22 compute-0 podman[218656]: 2026-02-19 19:54:22.294470796 +0000 UTC m=+0.071005734 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS)
Feb 19 19:54:22 compute-0 nova_compute[186662]: 2026-02-19 19:54:22.378 186666 WARNING neutronclient.v2_0.client [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:54:22 compute-0 nova_compute[186662]: 2026-02-19 19:54:22.378 186666 WARNING neutronclient.v2_0.client [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:54:22 compute-0 nova_compute[186662]: 2026-02-19 19:54:22.478 186666 DEBUG oslo_concurrency.lockutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:54:22 compute-0 nova_compute[186662]: 2026-02-19 19:54:22.479 186666 DEBUG oslo_concurrency.lockutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:54:22 compute-0 nova_compute[186662]: 2026-02-19 19:54:22.479 186666 DEBUG nova.network.neutron [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:54:22 compute-0 nova_compute[186662]: 2026-02-19 19:54:22.639 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:22 compute-0 nova_compute[186662]: 2026-02-19 19:54:22.986 186666 WARNING neutronclient.v2_0.client [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:54:23 compute-0 nova_compute[186662]: 2026-02-19 19:54:23.548 186666 WARNING neutronclient.v2_0.client [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:54:23 compute-0 nova_compute[186662]: 2026-02-19 19:54:23.716 186666 DEBUG nova.network.neutron [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Updating instance_info_cache with network_info: [{"id": "298f3d97-567c-49be-9314-19ab2839b2b8", "address": "fa:16:3e:2e:74:88", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap298f3d97-56", "ovs_interfaceid": "298f3d97-567c-49be-9314-19ab2839b2b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:54:24 compute-0 nova_compute[186662]: 2026-02-19 19:54:24.234 186666 DEBUG oslo_concurrency.lockutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:54:24 compute-0 nova_compute[186662]: 2026-02-19 19:54:24.759 186666 DEBUG oslo_concurrency.lockutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:54:24 compute-0 nova_compute[186662]: 2026-02-19 19:54:24.760 186666 DEBUG oslo_concurrency.lockutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:54:24 compute-0 nova_compute[186662]: 2026-02-19 19:54:24.760 186666 DEBUG oslo_concurrency.lockutils [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:54:24 compute-0 nova_compute[186662]: 2026-02-19 19:54:24.764 186666 INFO nova.virt.libvirt.driver [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 19 19:54:24 compute-0 virtqemud[186157]: Domain id=23 name='instance-0000001e' uuid=eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d is tainted: custom-monitor
Feb 19 19:54:25 compute-0 nova_compute[186662]: 2026-02-19 19:54:25.770 186666 INFO nova.virt.libvirt.driver [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 19 19:54:25 compute-0 podman[218682]: 2026-02-19 19:54:25.833580961 +0000 UTC m=+0.044066141 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:54:26 compute-0 nova_compute[186662]: 2026-02-19 19:54:26.662 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:26 compute-0 nova_compute[186662]: 2026-02-19 19:54:26.774 186666 INFO nova.virt.libvirt.driver [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 19 19:54:26 compute-0 nova_compute[186662]: 2026-02-19 19:54:26.778 186666 DEBUG nova.compute.manager [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:54:27 compute-0 nova_compute[186662]: 2026-02-19 19:54:27.287 186666 DEBUG nova.objects.instance [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Feb 19 19:54:27 compute-0 nova_compute[186662]: 2026-02-19 19:54:27.641 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:28 compute-0 nova_compute[186662]: 2026-02-19 19:54:28.303 186666 WARNING neutronclient.v2_0.client [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:54:29 compute-0 nova_compute[186662]: 2026-02-19 19:54:29.375 186666 WARNING neutronclient.v2_0.client [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:54:29 compute-0 nova_compute[186662]: 2026-02-19 19:54:29.376 186666 WARNING neutronclient.v2_0.client [None req-824eef03-f288-4d4f-85e2-dd2ea78b6c44 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:54:29 compute-0 podman[196025]: time="2026-02-19T19:54:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:54:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:54:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17221 "" "Go-http-client/1.1"
Feb 19 19:54:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:54:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2665 "" "Go-http-client/1.1"
Feb 19 19:54:31 compute-0 openstack_network_exporter[198916]: ERROR   19:54:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:54:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:54:31 compute-0 openstack_network_exporter[198916]: ERROR   19:54:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:54:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:54:31 compute-0 nova_compute[186662]: 2026-02-19 19:54:31.664 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:32.165 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:54:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:32.166 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:54:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:32.167 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:54:32 compute-0 nova_compute[186662]: 2026-02-19 19:54:32.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:54:32 compute-0 nova_compute[186662]: 2026-02-19 19:54:32.644 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:34 compute-0 nova_compute[186662]: 2026-02-19 19:54:34.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:54:35 compute-0 nova_compute[186662]: 2026-02-19 19:54:35.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:54:36 compute-0 nova_compute[186662]: 2026-02-19 19:54:36.310 186666 DEBUG oslo_concurrency.lockutils [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "0b6d0709-7667-4eda-a3b1-ba3adf926203" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:54:36 compute-0 nova_compute[186662]: 2026-02-19 19:54:36.311 186666 DEBUG oslo_concurrency.lockutils [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "0b6d0709-7667-4eda-a3b1-ba3adf926203" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:54:36 compute-0 nova_compute[186662]: 2026-02-19 19:54:36.311 186666 DEBUG oslo_concurrency.lockutils [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "0b6d0709-7667-4eda-a3b1-ba3adf926203-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:54:36 compute-0 nova_compute[186662]: 2026-02-19 19:54:36.312 186666 DEBUG oslo_concurrency.lockutils [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "0b6d0709-7667-4eda-a3b1-ba3adf926203-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:54:36 compute-0 nova_compute[186662]: 2026-02-19 19:54:36.312 186666 DEBUG oslo_concurrency.lockutils [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "0b6d0709-7667-4eda-a3b1-ba3adf926203-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:54:36 compute-0 nova_compute[186662]: 2026-02-19 19:54:36.323 186666 INFO nova.compute.manager [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Terminating instance
Feb 19 19:54:36 compute-0 sshd-session[218712]: Invalid user ubuntu from 103.67.78.251 port 54344
Feb 19 19:54:36 compute-0 nova_compute[186662]: 2026-02-19 19:54:36.714 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:36 compute-0 nova_compute[186662]: 2026-02-19 19:54:36.838 186666 DEBUG nova.compute.manager [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:54:36 compute-0 kernel: tapb8e7eb67-d7 (unregistering): left promiscuous mode
Feb 19 19:54:36 compute-0 NetworkManager[56519]: <info>  [1771530876.8592] device (tapb8e7eb67-d7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:54:36 compute-0 ovn_controller[96653]: 2026-02-19T19:54:36Z|00245|binding|INFO|Releasing lport b8e7eb67-d798-4e84-b656-41c8f5d55d83 from this chassis (sb_readonly=0)
Feb 19 19:54:36 compute-0 ovn_controller[96653]: 2026-02-19T19:54:36Z|00246|binding|INFO|Setting lport b8e7eb67-d798-4e84-b656-41c8f5d55d83 down in Southbound
Feb 19 19:54:36 compute-0 ovn_controller[96653]: 2026-02-19T19:54:36Z|00247|binding|INFO|Removing iface tapb8e7eb67-d7 ovn-installed in OVS
Feb 19 19:54:36 compute-0 nova_compute[186662]: 2026-02-19 19:54:36.865 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:36 compute-0 nova_compute[186662]: 2026-02-19 19:54:36.867 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:36 compute-0 nova_compute[186662]: 2026-02-19 19:54:36.871 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:36 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:36.873 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:2a:86 10.100.0.14'], port_security=['fa:16:3e:33:2a:86 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '0b6d0709-7667-4eda-a3b1-ba3adf926203', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dafcf6090521473590cdb432a889739e', 'neutron:revision_number': '5', 'neutron:security_group_ids': '8f4a8406-fbc3-4742-80fc-e9181ef138d4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ddef6f8-604e-4b41-9f83-1182f76298b6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=b8e7eb67-d798-4e84-b656-41c8f5d55d83) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:54:36 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:36.874 105986 INFO neutron.agent.ovn.metadata.agent [-] Port b8e7eb67-d798-4e84-b656-41c8f5d55d83 in datapath 2cb93231-0e3e-4efd-8b0c-4366500dbd16 unbound from our chassis
Feb 19 19:54:36 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:36.876 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2cb93231-0e3e-4efd-8b0c-4366500dbd16
Feb 19 19:54:36 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:36.886 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[b0a02cd5-7780-4cab-a512-ae7c0401f7f4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:36 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:36.904 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[7019c4ca-0b58-4732-8e2c-50e5d57b298c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:36 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:36.906 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[0bb45c19-3a43-443b-8e22-a930e07f1604]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:36 compute-0 sshd-session[218712]: Received disconnect from 103.67.78.251 port 54344:11: Bye Bye [preauth]
Feb 19 19:54:36 compute-0 sshd-session[218712]: Disconnected from invalid user ubuntu 103.67.78.251 port 54344 [preauth]
Feb 19 19:54:36 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001f.scope: Deactivated successfully.
Feb 19 19:54:36 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001f.scope: Consumed 13.324s CPU time.
Feb 19 19:54:36 compute-0 systemd-machined[156014]: Machine qemu-22-instance-0000001f terminated.
Feb 19 19:54:36 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:36.924 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[431583d5-444e-4d53-9be2-a8ca08f74d6b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:36 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:36.935 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[df0d0285-b462-4f44-ad6b-5b84c6476180]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2cb93231-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:50:71'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 509428, 'reachable_time': 15664, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218726, 'error': None, 'target': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:36 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:36.947 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[e315dc1e-b85f-4d06-ad69-aa038f65564d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2cb93231-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 509438, 'tstamp': 509438}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218727, 'error': None, 'target': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2cb93231-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 509441, 'tstamp': 509441}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218727, 'error': None, 'target': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:36 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:36.948 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2cb93231-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:54:36 compute-0 nova_compute[186662]: 2026-02-19 19:54:36.950 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:36 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:36.954 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2cb93231-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:54:36 compute-0 nova_compute[186662]: 2026-02-19 19:54:36.953 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:36 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:36.954 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:54:36 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:36.954 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2cb93231-00, col_values=(('external_ids', {'iface-id': '4d5db50f-b83e-4965-bddf-4fd62a569e63'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:54:36 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:36.954 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:54:36 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:36.955 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[adad9a8a-d342-4663-9d47-5e595a8c28df]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-2cb93231-0e3e-4efd-8b0c-4366500dbd16\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 2cb93231-0e3e-4efd-8b0c-4366500dbd16\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.086 186666 INFO nova.virt.libvirt.driver [-] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Instance destroyed successfully.
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.086 186666 DEBUG nova.objects.instance [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lazy-loading 'resources' on Instance uuid 0b6d0709-7667-4eda-a3b1-ba3adf926203 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.436 186666 DEBUG nova.compute.manager [req-11d72c52-25db-4525-ba80-bff60ae5b13a req-f4464cb8-23d0-40b8-8557-ec71f3862c9c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Received event network-vif-unplugged-b8e7eb67-d798-4e84-b656-41c8f5d55d83 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.436 186666 DEBUG oslo_concurrency.lockutils [req-11d72c52-25db-4525-ba80-bff60ae5b13a req-f4464cb8-23d0-40b8-8557-ec71f3862c9c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "0b6d0709-7667-4eda-a3b1-ba3adf926203-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.436 186666 DEBUG oslo_concurrency.lockutils [req-11d72c52-25db-4525-ba80-bff60ae5b13a req-f4464cb8-23d0-40b8-8557-ec71f3862c9c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "0b6d0709-7667-4eda-a3b1-ba3adf926203-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.437 186666 DEBUG oslo_concurrency.lockutils [req-11d72c52-25db-4525-ba80-bff60ae5b13a req-f4464cb8-23d0-40b8-8557-ec71f3862c9c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "0b6d0709-7667-4eda-a3b1-ba3adf926203-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.437 186666 DEBUG nova.compute.manager [req-11d72c52-25db-4525-ba80-bff60ae5b13a req-f4464cb8-23d0-40b8-8557-ec71f3862c9c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] No waiting events found dispatching network-vif-unplugged-b8e7eb67-d798-4e84-b656-41c8f5d55d83 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.437 186666 DEBUG nova.compute.manager [req-11d72c52-25db-4525-ba80-bff60ae5b13a req-f4464cb8-23d0-40b8-8557-ec71f3862c9c 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Received event network-vif-unplugged-b8e7eb67-d798-4e84-b656-41c8f5d55d83 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:54:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:37.519 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.520 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:37 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:37.520 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.597 186666 DEBUG nova.virt.libvirt.vif [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:53:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-206860921',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-206860921',id=31,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:53:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dafcf6090521473590cdb432a889739e',ramdisk_id='',reservation_id='r-1rrmxr4p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_m
in_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-81034023',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-81034023-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:53:46Z,user_data=None,user_id='6f190c8d209a43b19d4cba5936ab90e0',uuid=0b6d0709-7667-4eda-a3b1-ba3adf926203,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b8e7eb67-d798-4e84-b656-41c8f5d55d83", "address": "fa:16:3e:33:2a:86", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8e7eb67-d7", "ovs_interfaceid": "b8e7eb67-d798-4e84-b656-41c8f5d55d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.598 186666 DEBUG nova.network.os_vif_util [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Converting VIF {"id": "b8e7eb67-d798-4e84-b656-41c8f5d55d83", "address": "fa:16:3e:33:2a:86", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb8e7eb67-d7", "ovs_interfaceid": "b8e7eb67-d798-4e84-b656-41c8f5d55d83", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.598 186666 DEBUG nova.network.os_vif_util [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:2a:86,bridge_name='br-int',has_traffic_filtering=True,id=b8e7eb67-d798-4e84-b656-41c8f5d55d83,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8e7eb67-d7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.599 186666 DEBUG os_vif [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:2a:86,bridge_name='br-int',has_traffic_filtering=True,id=b8e7eb67-d798-4e84-b656-41c8f5d55d83,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8e7eb67-d7') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.600 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.600 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8e7eb67-d7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.601 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.603 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.604 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.604 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=1d21dc13-0edc-4073-a6bd-1d8bc06263ec) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.604 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.606 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.608 186666 INFO os_vif [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:2a:86,bridge_name='br-int',has_traffic_filtering=True,id=b8e7eb67-d798-4e84-b656-41c8f5d55d83,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb8e7eb67-d7')
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.608 186666 INFO nova.virt.libvirt.driver [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Deleting instance files /var/lib/nova/instances/0b6d0709-7667-4eda-a3b1-ba3adf926203_del
Feb 19 19:54:37 compute-0 nova_compute[186662]: 2026-02-19 19:54:37.609 186666 INFO nova.virt.libvirt.driver [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Deletion of /var/lib/nova/instances/0b6d0709-7667-4eda-a3b1-ba3adf926203_del complete
Feb 19 19:54:38 compute-0 nova_compute[186662]: 2026-02-19 19:54:38.120 186666 INFO nova.compute.manager [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Took 1.28 seconds to destroy the instance on the hypervisor.
Feb 19 19:54:38 compute-0 nova_compute[186662]: 2026-02-19 19:54:38.121 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:54:38 compute-0 nova_compute[186662]: 2026-02-19 19:54:38.121 186666 DEBUG nova.compute.manager [-] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:54:38 compute-0 nova_compute[186662]: 2026-02-19 19:54:38.121 186666 DEBUG nova.network.neutron [-] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:54:38 compute-0 nova_compute[186662]: 2026-02-19 19:54:38.122 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:54:38 compute-0 nova_compute[186662]: 2026-02-19 19:54:38.373 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:54:39 compute-0 nova_compute[186662]: 2026-02-19 19:54:39.494 186666 DEBUG nova.compute.manager [req-b58a5d13-fb8b-468f-8d8a-012f166d10e6 req-ba03ae8f-ba2b-4843-af0f-9fa7fcb75493 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Received event network-vif-unplugged-b8e7eb67-d798-4e84-b656-41c8f5d55d83 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:54:39 compute-0 nova_compute[186662]: 2026-02-19 19:54:39.495 186666 DEBUG oslo_concurrency.lockutils [req-b58a5d13-fb8b-468f-8d8a-012f166d10e6 req-ba03ae8f-ba2b-4843-af0f-9fa7fcb75493 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "0b6d0709-7667-4eda-a3b1-ba3adf926203-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:54:39 compute-0 nova_compute[186662]: 2026-02-19 19:54:39.495 186666 DEBUG oslo_concurrency.lockutils [req-b58a5d13-fb8b-468f-8d8a-012f166d10e6 req-ba03ae8f-ba2b-4843-af0f-9fa7fcb75493 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "0b6d0709-7667-4eda-a3b1-ba3adf926203-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:54:39 compute-0 nova_compute[186662]: 2026-02-19 19:54:39.496 186666 DEBUG oslo_concurrency.lockutils [req-b58a5d13-fb8b-468f-8d8a-012f166d10e6 req-ba03ae8f-ba2b-4843-af0f-9fa7fcb75493 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "0b6d0709-7667-4eda-a3b1-ba3adf926203-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:54:39 compute-0 nova_compute[186662]: 2026-02-19 19:54:39.496 186666 DEBUG nova.compute.manager [req-b58a5d13-fb8b-468f-8d8a-012f166d10e6 req-ba03ae8f-ba2b-4843-af0f-9fa7fcb75493 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] No waiting events found dispatching network-vif-unplugged-b8e7eb67-d798-4e84-b656-41c8f5d55d83 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:54:39 compute-0 nova_compute[186662]: 2026-02-19 19:54:39.497 186666 DEBUG nova.compute.manager [req-b58a5d13-fb8b-468f-8d8a-012f166d10e6 req-ba03ae8f-ba2b-4843-af0f-9fa7fcb75493 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Received event network-vif-unplugged-b8e7eb67-d798-4e84-b656-41c8f5d55d83 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:54:39 compute-0 nova_compute[186662]: 2026-02-19 19:54:39.497 186666 DEBUG nova.compute.manager [req-b58a5d13-fb8b-468f-8d8a-012f166d10e6 req-ba03ae8f-ba2b-4843-af0f-9fa7fcb75493 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Received event network-vif-deleted-b8e7eb67-d798-4e84-b656-41c8f5d55d83 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:54:39 compute-0 nova_compute[186662]: 2026-02-19 19:54:39.497 186666 INFO nova.compute.manager [req-b58a5d13-fb8b-468f-8d8a-012f166d10e6 req-ba03ae8f-ba2b-4843-af0f-9fa7fcb75493 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Neutron deleted interface b8e7eb67-d798-4e84-b656-41c8f5d55d83; detaching it from the instance and deleting it from the info cache
Feb 19 19:54:39 compute-0 nova_compute[186662]: 2026-02-19 19:54:39.498 186666 DEBUG nova.network.neutron [req-b58a5d13-fb8b-468f-8d8a-012f166d10e6 req-ba03ae8f-ba2b-4843-af0f-9fa7fcb75493 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:54:39 compute-0 nova_compute[186662]: 2026-02-19 19:54:39.585 186666 DEBUG nova.network.neutron [-] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:54:40 compute-0 nova_compute[186662]: 2026-02-19 19:54:40.008 186666 DEBUG nova.compute.manager [req-b58a5d13-fb8b-468f-8d8a-012f166d10e6 req-ba03ae8f-ba2b-4843-af0f-9fa7fcb75493 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Detach interface failed, port_id=b8e7eb67-d798-4e84-b656-41c8f5d55d83, reason: Instance 0b6d0709-7667-4eda-a3b1-ba3adf926203 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Feb 19 19:54:40 compute-0 nova_compute[186662]: 2026-02-19 19:54:40.096 186666 INFO nova.compute.manager [-] [instance: 0b6d0709-7667-4eda-a3b1-ba3adf926203] Took 1.98 seconds to deallocate network for instance.
Feb 19 19:54:40 compute-0 nova_compute[186662]: 2026-02-19 19:54:40.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:54:40 compute-0 nova_compute[186662]: 2026-02-19 19:54:40.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:54:40 compute-0 nova_compute[186662]: 2026-02-19 19:54:40.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:54:40 compute-0 nova_compute[186662]: 2026-02-19 19:54:40.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:54:40 compute-0 nova_compute[186662]: 2026-02-19 19:54:40.620 186666 DEBUG oslo_concurrency.lockutils [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:54:40 compute-0 nova_compute[186662]: 2026-02-19 19:54:40.620 186666 DEBUG oslo_concurrency.lockutils [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:54:40 compute-0 nova_compute[186662]: 2026-02-19 19:54:40.642 186666 DEBUG nova.scheduler.client.report [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Refreshing inventories for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Feb 19 19:54:40 compute-0 nova_compute[186662]: 2026-02-19 19:54:40.656 186666 DEBUG nova.scheduler.client.report [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Updating ProviderTree inventory for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Feb 19 19:54:40 compute-0 nova_compute[186662]: 2026-02-19 19:54:40.656 186666 DEBUG nova.compute.provider_tree [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Updating inventory in ProviderTree for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Feb 19 19:54:40 compute-0 nova_compute[186662]: 2026-02-19 19:54:40.667 186666 DEBUG nova.scheduler.client.report [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Refreshing aggregate associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Feb 19 19:54:40 compute-0 nova_compute[186662]: 2026-02-19 19:54:40.682 186666 DEBUG nova.scheduler.client.report [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Refreshing trait associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_SSE2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,HW_ARCH_X86_64,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_USB
 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Feb 19 19:54:40 compute-0 nova_compute[186662]: 2026-02-19 19:54:40.721 186666 DEBUG nova.compute.provider_tree [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:54:41 compute-0 nova_compute[186662]: 2026-02-19 19:54:41.268 186666 DEBUG nova.scheduler.client.report [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:54:41 compute-0 nova_compute[186662]: 2026-02-19 19:54:41.271 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:54:41 compute-0 nova_compute[186662]: 2026-02-19 19:54:41.715 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:41 compute-0 nova_compute[186662]: 2026-02-19 19:54:41.777 186666 DEBUG oslo_concurrency.lockutils [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.157s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:54:41 compute-0 nova_compute[186662]: 2026-02-19 19:54:41.779 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.508s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:54:41 compute-0 nova_compute[186662]: 2026-02-19 19:54:41.779 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:54:41 compute-0 nova_compute[186662]: 2026-02-19 19:54:41.780 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:54:41 compute-0 nova_compute[186662]: 2026-02-19 19:54:41.797 186666 INFO nova.scheduler.client.report [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Deleted allocations for instance 0b6d0709-7667-4eda-a3b1-ba3adf926203
Feb 19 19:54:42 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:42.522 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:54:42 compute-0 nova_compute[186662]: 2026-02-19 19:54:42.605 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:42 compute-0 nova_compute[186662]: 2026-02-19 19:54:42.816 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:54:42 compute-0 nova_compute[186662]: 2026-02-19 19:54:42.828 186666 DEBUG oslo_concurrency.lockutils [None req-ff4c7ff6-fdf1-411f-aed6-b6b9a61bfa85 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "0b6d0709-7667-4eda-a3b1-ba3adf926203" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.518s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:54:42 compute-0 nova_compute[186662]: 2026-02-19 19:54:42.860 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:54:42 compute-0 nova_compute[186662]: 2026-02-19 19:54:42.860 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:54:42 compute-0 nova_compute[186662]: 2026-02-19 19:54:42.902 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:54:43 compute-0 nova_compute[186662]: 2026-02-19 19:54:43.015 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:54:43 compute-0 nova_compute[186662]: 2026-02-19 19:54:43.016 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:54:43 compute-0 nova_compute[186662]: 2026-02-19 19:54:43.028 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:54:43 compute-0 nova_compute[186662]: 2026-02-19 19:54:43.028 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5661MB free_disk=72.93882369995117GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:54:43 compute-0 nova_compute[186662]: 2026-02-19 19:54:43.029 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:54:43 compute-0 nova_compute[186662]: 2026-02-19 19:54:43.029 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:54:43 compute-0 nova_compute[186662]: 2026-02-19 19:54:43.538 186666 DEBUG oslo_concurrency.lockutils [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:54:43 compute-0 nova_compute[186662]: 2026-02-19 19:54:43.539 186666 DEBUG oslo_concurrency.lockutils [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:54:43 compute-0 nova_compute[186662]: 2026-02-19 19:54:43.539 186666 DEBUG oslo_concurrency.lockutils [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:54:43 compute-0 nova_compute[186662]: 2026-02-19 19:54:43.539 186666 DEBUG oslo_concurrency.lockutils [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:54:43 compute-0 nova_compute[186662]: 2026-02-19 19:54:43.540 186666 DEBUG oslo_concurrency.lockutils [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:54:43 compute-0 nova_compute[186662]: 2026-02-19 19:54:43.555 186666 INFO nova.compute.manager [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Terminating instance
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.069 186666 DEBUG nova.compute.manager [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:54:44 compute-0 kernel: tap298f3d97-56 (unregistering): left promiscuous mode
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.098 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:44 compute-0 NetworkManager[56519]: <info>  [1771530884.0986] device (tap298f3d97-56): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:54:44 compute-0 ovn_controller[96653]: 2026-02-19T19:54:44Z|00248|binding|INFO|Releasing lport 298f3d97-567c-49be-9314-19ab2839b2b8 from this chassis (sb_readonly=0)
Feb 19 19:54:44 compute-0 ovn_controller[96653]: 2026-02-19T19:54:44Z|00249|binding|INFO|Setting lport 298f3d97-567c-49be-9314-19ab2839b2b8 down in Southbound
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.102 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:44 compute-0 ovn_controller[96653]: 2026-02-19T19:54:44Z|00250|binding|INFO|Removing iface tap298f3d97-56 ovn-installed in OVS
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.104 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.107 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:44.114 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:74:88 10.100.0.4'], port_security=['fa:16:3e:2e:74:88 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dafcf6090521473590cdb432a889739e', 'neutron:revision_number': '15', 'neutron:security_group_ids': '8f4a8406-fbc3-4742-80fc-e9181ef138d4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ddef6f8-604e-4b41-9f83-1182f76298b6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=298f3d97-567c-49be-9314-19ab2839b2b8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:54:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:44.115 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 298f3d97-567c-49be-9314-19ab2839b2b8 in datapath 2cb93231-0e3e-4efd-8b0c-4366500dbd16 unbound from our chassis
Feb 19 19:54:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:44.116 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2cb93231-0e3e-4efd-8b0c-4366500dbd16, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:54:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:44.117 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[91927c74-7920-4bd1-b00d-414a8bcdf911]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:44.117 105986 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16 namespace which is not needed anymore
Feb 19 19:54:44 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Feb 19 19:54:44 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001e.scope: Consumed 1.953s CPU time.
Feb 19 19:54:44 compute-0 systemd-machined[156014]: Machine qemu-23-instance-0000001e terminated.
Feb 19 19:54:44 compute-0 neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16[218427]: [NOTICE]   (218431) : haproxy version is 3.0.5-8e879a5
Feb 19 19:54:44 compute-0 podman[218778]: 2026-02-19 19:54:44.210104196 +0000 UTC m=+0.022270991 container kill d0534c9681dd8baf42f07c96771b729f8a0b621106655aadbd99d5b48a53a2b8 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 19 19:54:44 compute-0 neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16[218427]: [NOTICE]   (218431) : path to executable is /usr/sbin/haproxy
Feb 19 19:54:44 compute-0 neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16[218427]: [WARNING]  (218431) : Exiting Master process...
Feb 19 19:54:44 compute-0 neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16[218427]: [ALERT]    (218431) : Current worker (218433) exited with code 143 (Terminated)
Feb 19 19:54:44 compute-0 neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16[218427]: [WARNING]  (218431) : All workers exited. Exiting... (0)
Feb 19 19:54:44 compute-0 systemd[1]: libpod-d0534c9681dd8baf42f07c96771b729f8a0b621106655aadbd99d5b48a53a2b8.scope: Deactivated successfully.
Feb 19 19:54:44 compute-0 podman[218793]: 2026-02-19 19:54:44.246253853 +0000 UTC m=+0.021049591 container died d0534c9681dd8baf42f07c96771b729f8a0b621106655aadbd99d5b48a53a2b8 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Feb 19 19:54:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d0534c9681dd8baf42f07c96771b729f8a0b621106655aadbd99d5b48a53a2b8-userdata-shm.mount: Deactivated successfully.
Feb 19 19:54:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-374a82fac12f8cc289c722a142405a2d534ea82f2d82c09821b97e39eeddd59f-merged.mount: Deactivated successfully.
Feb 19 19:54:44 compute-0 podman[218793]: 2026-02-19 19:54:44.274125769 +0000 UTC m=+0.048921487 container cleanup d0534c9681dd8baf42f07c96771b729f8a0b621106655aadbd99d5b48a53a2b8 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 19 19:54:44 compute-0 systemd[1]: libpod-conmon-d0534c9681dd8baf42f07c96771b729f8a0b621106655aadbd99d5b48a53a2b8.scope: Deactivated successfully.
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.280 186666 DEBUG nova.compute.manager [req-6585884d-6a46-418b-9902-2f9d2310b4b0 req-2eacb9dc-8854-41ac-a13b-d041a345b2b6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Received event network-vif-unplugged-298f3d97-567c-49be-9314-19ab2839b2b8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.280 186666 DEBUG oslo_concurrency.lockutils [req-6585884d-6a46-418b-9902-2f9d2310b4b0 req-2eacb9dc-8854-41ac-a13b-d041a345b2b6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.281 186666 DEBUG oslo_concurrency.lockutils [req-6585884d-6a46-418b-9902-2f9d2310b4b0 req-2eacb9dc-8854-41ac-a13b-d041a345b2b6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.281 186666 DEBUG oslo_concurrency.lockutils [req-6585884d-6a46-418b-9902-2f9d2310b4b0 req-2eacb9dc-8854-41ac-a13b-d041a345b2b6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.281 186666 DEBUG nova.compute.manager [req-6585884d-6a46-418b-9902-2f9d2310b4b0 req-2eacb9dc-8854-41ac-a13b-d041a345b2b6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] No waiting events found dispatching network-vif-unplugged-298f3d97-567c-49be-9314-19ab2839b2b8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.281 186666 DEBUG nova.compute.manager [req-6585884d-6a46-418b-9902-2f9d2310b4b0 req-2eacb9dc-8854-41ac-a13b-d041a345b2b6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Received event network-vif-unplugged-298f3d97-567c-49be-9314-19ab2839b2b8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.286 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.289 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:44 compute-0 podman[218795]: 2026-02-19 19:54:44.290770913 +0000 UTC m=+0.058970651 container remove d0534c9681dd8baf42f07c96771b729f8a0b621106655aadbd99d5b48a53a2b8 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:54:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:44.294 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[639dc320-446b-45b3-9064-89a3d312981e]: (4, ("Thu Feb 19 07:54:44 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16 (d0534c9681dd8baf42f07c96771b729f8a0b621106655aadbd99d5b48a53a2b8)\nd0534c9681dd8baf42f07c96771b729f8a0b621106655aadbd99d5b48a53a2b8\nThu Feb 19 07:54:44 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16 (d0534c9681dd8baf42f07c96771b729f8a0b621106655aadbd99d5b48a53a2b8)\nd0534c9681dd8baf42f07c96771b729f8a0b621106655aadbd99d5b48a53a2b8\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:44.296 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[29ea097c-f794-44f3-898b-ba8edf287eda]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:44.296 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:54:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:44.296 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[1d751395-9641-48a3-a614-74117c6ae8af]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:44.297 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2cb93231-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.298 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:44 compute-0 kernel: tap2cb93231-00: left promiscuous mode
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.309 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:44.311 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[90d7438f-a68f-41d1-93f5-81207ec1c2e6]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.320 186666 INFO nova.virt.libvirt.driver [-] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Instance destroyed successfully.
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.321 186666 DEBUG nova.objects.instance [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lazy-loading 'resources' on Instance uuid eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:54:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:44.325 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[5f4ec793-fc3d-416f-8580-407f33427b92]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:44.326 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[0e57a11d-599a-498f-adca-2a06b95f2bed]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:44.337 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[c7d1f168-bd07-49c8-8cef-fac43afc8b9e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 509423, 'reachable_time': 38943, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218844, 'error': None, 'target': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:44 compute-0 systemd[1]: run-netns-ovnmeta\x2d2cb93231\x2d0e3e\x2d4efd\x2d8b0c\x2d4366500dbd16.mount: Deactivated successfully.
Feb 19 19:54:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:44.340 106358 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Feb 19 19:54:44 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:54:44.340 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[3db9f61f-b7ba-4afb-937d-bbef52d7972f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.576 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.576 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.576 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:54:43 up  1:25,  0 user,  load average: 0.12, 0.12, 0.18\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_deleting': '1', 'num_os_type_None': '1', 'num_proj_dafcf6090521473590cdb432a889739e': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.608 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.827 186666 DEBUG nova.virt.libvirt.vif [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-02-19T19:53:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-670100340',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-670100340',id=30,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:53:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dafcf6090521473590cdb432a889739e',ramdisk_id='',reservation_id='r-gdlb11w1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mode
l='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-81034023',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-81034023-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:54:27Z,user_data=None,user_id='6f190c8d209a43b19d4cba5936ab90e0',uuid=eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "298f3d97-567c-49be-9314-19ab2839b2b8", "address": "fa:16:3e:2e:74:88", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap298f3d97-56", "ovs_interfaceid": "298f3d97-567c-49be-9314-19ab2839b2b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.828 186666 DEBUG nova.network.os_vif_util [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Converting VIF {"id": "298f3d97-567c-49be-9314-19ab2839b2b8", "address": "fa:16:3e:2e:74:88", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap298f3d97-56", "ovs_interfaceid": "298f3d97-567c-49be-9314-19ab2839b2b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.828 186666 DEBUG nova.network.os_vif_util [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2e:74:88,bridge_name='br-int',has_traffic_filtering=True,id=298f3d97-567c-49be-9314-19ab2839b2b8,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap298f3d97-56') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.829 186666 DEBUG os_vif [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2e:74:88,bridge_name='br-int',has_traffic_filtering=True,id=298f3d97-567c-49be-9314-19ab2839b2b8,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap298f3d97-56') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.830 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.830 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap298f3d97-56, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.832 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.834 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.835 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.835 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=d753be96-a3c1-4d66-add9-b84e403c722d) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.835 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.836 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.838 186666 INFO os_vif [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2e:74:88,bridge_name='br-int',has_traffic_filtering=True,id=298f3d97-567c-49be-9314-19ab2839b2b8,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap298f3d97-56')
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.838 186666 INFO nova.virt.libvirt.driver [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Deleting instance files /var/lib/nova/instances/eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d_del
Feb 19 19:54:44 compute-0 nova_compute[186662]: 2026-02-19 19:54:44.838 186666 INFO nova.virt.libvirt.driver [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Deletion of /var/lib/nova/instances/eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d_del complete
Feb 19 19:54:45 compute-0 nova_compute[186662]: 2026-02-19 19:54:45.114 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:54:45 compute-0 nova_compute[186662]: 2026-02-19 19:54:45.350 186666 INFO nova.compute.manager [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Took 1.28 seconds to destroy the instance on the hypervisor.
Feb 19 19:54:45 compute-0 nova_compute[186662]: 2026-02-19 19:54:45.350 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:54:45 compute-0 nova_compute[186662]: 2026-02-19 19:54:45.351 186666 DEBUG nova.compute.manager [-] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:54:45 compute-0 nova_compute[186662]: 2026-02-19 19:54:45.351 186666 DEBUG nova.network.neutron [-] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:54:45 compute-0 nova_compute[186662]: 2026-02-19 19:54:45.351 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:54:45 compute-0 nova_compute[186662]: 2026-02-19 19:54:45.626 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:54:45 compute-0 nova_compute[186662]: 2026-02-19 19:54:45.626 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.598s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:54:45 compute-0 nova_compute[186662]: 2026-02-19 19:54:45.666 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:54:46 compute-0 nova_compute[186662]: 2026-02-19 19:54:46.041 186666 DEBUG nova.compute.manager [req-3dda1aa2-73d0-4d8e-9aff-6986444b3a9c req-0f0b33cf-9dfe-49de-9858-8e3df54ffe04 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Received event network-vif-deleted-298f3d97-567c-49be-9314-19ab2839b2b8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:54:46 compute-0 nova_compute[186662]: 2026-02-19 19:54:46.041 186666 INFO nova.compute.manager [req-3dda1aa2-73d0-4d8e-9aff-6986444b3a9c req-0f0b33cf-9dfe-49de-9858-8e3df54ffe04 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Neutron deleted interface 298f3d97-567c-49be-9314-19ab2839b2b8; detaching it from the instance and deleting it from the info cache
Feb 19 19:54:46 compute-0 nova_compute[186662]: 2026-02-19 19:54:46.041 186666 DEBUG nova.network.neutron [req-3dda1aa2-73d0-4d8e-9aff-6986444b3a9c req-0f0b33cf-9dfe-49de-9858-8e3df54ffe04 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:54:46 compute-0 podman[218845]: 2026-02-19 19:54:46.278392504 +0000 UTC m=+0.049936432 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Feb 19 19:54:46 compute-0 nova_compute[186662]: 2026-02-19 19:54:46.351 186666 DEBUG nova.compute.manager [req-5cdf0933-f842-44c6-8d5c-196ede39e074 req-5b59d0f3-25d4-4697-8fb7-2b528322e9a6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Received event network-vif-unplugged-298f3d97-567c-49be-9314-19ab2839b2b8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:54:46 compute-0 nova_compute[186662]: 2026-02-19 19:54:46.352 186666 DEBUG oslo_concurrency.lockutils [req-5cdf0933-f842-44c6-8d5c-196ede39e074 req-5b59d0f3-25d4-4697-8fb7-2b528322e9a6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:54:46 compute-0 nova_compute[186662]: 2026-02-19 19:54:46.352 186666 DEBUG oslo_concurrency.lockutils [req-5cdf0933-f842-44c6-8d5c-196ede39e074 req-5b59d0f3-25d4-4697-8fb7-2b528322e9a6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:54:46 compute-0 nova_compute[186662]: 2026-02-19 19:54:46.352 186666 DEBUG oslo_concurrency.lockutils [req-5cdf0933-f842-44c6-8d5c-196ede39e074 req-5b59d0f3-25d4-4697-8fb7-2b528322e9a6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:54:46 compute-0 nova_compute[186662]: 2026-02-19 19:54:46.352 186666 DEBUG nova.compute.manager [req-5cdf0933-f842-44c6-8d5c-196ede39e074 req-5b59d0f3-25d4-4697-8fb7-2b528322e9a6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] No waiting events found dispatching network-vif-unplugged-298f3d97-567c-49be-9314-19ab2839b2b8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:54:46 compute-0 nova_compute[186662]: 2026-02-19 19:54:46.352 186666 DEBUG nova.compute.manager [req-5cdf0933-f842-44c6-8d5c-196ede39e074 req-5b59d0f3-25d4-4697-8fb7-2b528322e9a6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Received event network-vif-unplugged-298f3d97-567c-49be-9314-19ab2839b2b8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:54:46 compute-0 nova_compute[186662]: 2026-02-19 19:54:46.443 186666 DEBUG nova.network.neutron [-] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:54:46 compute-0 nova_compute[186662]: 2026-02-19 19:54:46.547 186666 DEBUG nova.compute.manager [req-3dda1aa2-73d0-4d8e-9aff-6986444b3a9c req-0f0b33cf-9dfe-49de-9858-8e3df54ffe04 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Detach interface failed, port_id=298f3d97-567c-49be-9314-19ab2839b2b8, reason: Instance eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Feb 19 19:54:46 compute-0 nova_compute[186662]: 2026-02-19 19:54:46.716 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:46 compute-0 nova_compute[186662]: 2026-02-19 19:54:46.947 186666 INFO nova.compute.manager [-] [instance: eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d] Took 1.60 seconds to deallocate network for instance.
Feb 19 19:54:47 compute-0 nova_compute[186662]: 2026-02-19 19:54:47.469 186666 DEBUG oslo_concurrency.lockutils [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:54:47 compute-0 nova_compute[186662]: 2026-02-19 19:54:47.470 186666 DEBUG oslo_concurrency.lockutils [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:54:47 compute-0 nova_compute[186662]: 2026-02-19 19:54:47.508 186666 DEBUG nova.compute.provider_tree [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:54:48 compute-0 nova_compute[186662]: 2026-02-19 19:54:48.017 186666 DEBUG nova.scheduler.client.report [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:54:48 compute-0 nova_compute[186662]: 2026-02-19 19:54:48.529 186666 DEBUG oslo_concurrency.lockutils [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.059s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:54:48 compute-0 nova_compute[186662]: 2026-02-19 19:54:48.550 186666 INFO nova.scheduler.client.report [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Deleted allocations for instance eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d
Feb 19 19:54:49 compute-0 nova_compute[186662]: 2026-02-19 19:54:49.576 186666 DEBUG oslo_concurrency.lockutils [None req-d5f5ce79-64b6-40dc-aa09-9101e914b59c 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "eb844d41-9cb6-4e72-bf27-4fb42fe5fd5d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.037s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:54:49 compute-0 nova_compute[186662]: 2026-02-19 19:54:49.627 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:54:49 compute-0 nova_compute[186662]: 2026-02-19 19:54:49.836 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:51 compute-0 podman[218866]: 2026-02-19 19:54:51.270384727 +0000 UTC m=+0.042246966 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal 
Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, release=1770267347, container_name=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Feb 19 19:54:51 compute-0 nova_compute[186662]: 2026-02-19 19:54:51.757 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:53 compute-0 podman[218887]: 2026-02-19 19:54:53.288610109 +0000 UTC m=+0.066670308 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20260216)
Feb 19 19:54:54 compute-0 nova_compute[186662]: 2026-02-19 19:54:54.838 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:56 compute-0 podman[218915]: 2026-02-19 19:54:56.272542256 +0000 UTC m=+0.047476503 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:54:56 compute-0 nova_compute[186662]: 2026-02-19 19:54:56.759 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:54:57 compute-0 nova_compute[186662]: 2026-02-19 19:54:57.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:54:57 compute-0 nova_compute[186662]: 2026-02-19 19:54:57.576 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:54:57 compute-0 nova_compute[186662]: 2026-02-19 19:54:57.578 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:54:57 compute-0 nova_compute[186662]: 2026-02-19 19:54:57.578 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:54:57 compute-0 nova_compute[186662]: 2026-02-19 19:54:57.578 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:54:57 compute-0 nova_compute[186662]: 2026-02-19 19:54:57.578 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:54:57 compute-0 nova_compute[186662]: 2026-02-19 19:54:57.579 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:54:58 compute-0 nova_compute[186662]: 2026-02-19 19:54:58.589 186666 DEBUG nova.virt.libvirt.imagecache [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:314
Feb 19 19:54:58 compute-0 nova_compute[186662]: 2026-02-19 19:54:58.590 186666 WARNING nova.virt.libvirt.imagecache [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c
Feb 19 19:54:58 compute-0 nova_compute[186662]: 2026-02-19 19:54:58.590 186666 INFO nova.virt.libvirt.imagecache [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Removable base files: /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c
Feb 19 19:54:58 compute-0 nova_compute[186662]: 2026-02-19 19:54:58.590 186666 INFO nova.virt.libvirt.imagecache [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c
Feb 19 19:54:58 compute-0 nova_compute[186662]: 2026-02-19 19:54:58.590 186666 DEBUG nova.virt.libvirt.imagecache [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:350
Feb 19 19:54:58 compute-0 nova_compute[186662]: 2026-02-19 19:54:58.591 186666 DEBUG nova.virt.libvirt.imagecache [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:299
Feb 19 19:54:58 compute-0 nova_compute[186662]: 2026-02-19 19:54:58.591 186666 DEBUG nova.virt.libvirt.imagecache [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:284
Feb 19 19:54:59 compute-0 podman[196025]: time="2026-02-19T19:54:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:54:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:54:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:54:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:54:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2198 "" "Go-http-client/1.1"
Feb 19 19:54:59 compute-0 nova_compute[186662]: 2026-02-19 19:54:59.850 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:01 compute-0 openstack_network_exporter[198916]: ERROR   19:55:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:55:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:55:01 compute-0 openstack_network_exporter[198916]: ERROR   19:55:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:55:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:55:01 compute-0 nova_compute[186662]: 2026-02-19 19:55:01.761 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:04 compute-0 nova_compute[186662]: 2026-02-19 19:55:04.851 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:06 compute-0 nova_compute[186662]: 2026-02-19 19:55:06.763 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:09 compute-0 nova_compute[186662]: 2026-02-19 19:55:09.853 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:11 compute-0 nova_compute[186662]: 2026-02-19 19:55:11.763 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:13 compute-0 nova_compute[186662]: 2026-02-19 19:55:13.501 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "15173393-c568-47d6-8cec-a82e80673bb2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:55:13 compute-0 nova_compute[186662]: 2026-02-19 19:55:13.501 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "15173393-c568-47d6-8cec-a82e80673bb2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:55:14 compute-0 nova_compute[186662]: 2026-02-19 19:55:14.007 186666 DEBUG nova.compute.manager [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Feb 19 19:55:14 compute-0 nova_compute[186662]: 2026-02-19 19:55:14.566 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:55:14 compute-0 nova_compute[186662]: 2026-02-19 19:55:14.566 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:55:14 compute-0 nova_compute[186662]: 2026-02-19 19:55:14.572 186666 DEBUG nova.virt.hardware [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Feb 19 19:55:14 compute-0 nova_compute[186662]: 2026-02-19 19:55:14.572 186666 INFO nova.compute.claims [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Claim successful on node compute-0.ctlplane.example.com
Feb 19 19:55:14 compute-0 nova_compute[186662]: 2026-02-19 19:55:14.854 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:15 compute-0 nova_compute[186662]: 2026-02-19 19:55:15.614 186666 DEBUG nova.compute.provider_tree [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:55:16 compute-0 nova_compute[186662]: 2026-02-19 19:55:16.120 186666 DEBUG nova.scheduler.client.report [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:55:16 compute-0 nova_compute[186662]: 2026-02-19 19:55:16.627 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.061s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:55:16 compute-0 nova_compute[186662]: 2026-02-19 19:55:16.628 186666 DEBUG nova.compute.manager [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Feb 19 19:55:16 compute-0 nova_compute[186662]: 2026-02-19 19:55:16.766 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:17 compute-0 nova_compute[186662]: 2026-02-19 19:55:17.139 186666 DEBUG nova.compute.manager [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Feb 19 19:55:17 compute-0 nova_compute[186662]: 2026-02-19 19:55:17.140 186666 DEBUG nova.network.neutron [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Feb 19 19:55:17 compute-0 nova_compute[186662]: 2026-02-19 19:55:17.140 186666 WARNING neutronclient.v2_0.client [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:55:17 compute-0 nova_compute[186662]: 2026-02-19 19:55:17.140 186666 WARNING neutronclient.v2_0.client [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:55:17 compute-0 podman[218940]: 2026-02-19 19:55:17.301610786 +0000 UTC m=+0.073012981 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:55:17 compute-0 nova_compute[186662]: 2026-02-19 19:55:17.654 186666 INFO nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 19 19:55:18 compute-0 nova_compute[186662]: 2026-02-19 19:55:18.160 186666 DEBUG nova.compute.manager [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Feb 19 19:55:18 compute-0 nova_compute[186662]: 2026-02-19 19:55:18.760 186666 DEBUG nova.network.neutron [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Successfully created port: 0fa7e740-cbda-4159-98cf-b9a37242e117 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.271 186666 DEBUG nova.compute.manager [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.272 186666 DEBUG nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.273 186666 INFO nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Creating image(s)
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.273 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "/var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.273 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "/var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.274 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "/var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.274 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.277 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.278 186666 DEBUG oslo_concurrency.processutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.338 186666 DEBUG oslo_concurrency.processutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.339 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.339 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.340 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.344 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.344 186666 DEBUG oslo_concurrency.processutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.397 186666 DEBUG oslo_concurrency.processutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.398 186666 DEBUG oslo_concurrency.processutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.424 186666 DEBUG oslo_concurrency.processutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2/disk 1073741824" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.425 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.086s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.425 186666 DEBUG oslo_concurrency.processutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.468 186666 DEBUG oslo_concurrency.processutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.469 186666 DEBUG nova.virt.disk.api [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Checking if we can resize image /var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.469 186666 DEBUG oslo_concurrency.processutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.516 186666 DEBUG oslo_concurrency.processutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.516 186666 DEBUG nova.virt.disk.api [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Cannot resize image /var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.517 186666 DEBUG nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.517 186666 DEBUG nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Ensure instance console log exists: /var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.517 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.518 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.518 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:55:19 compute-0 nova_compute[186662]: 2026-02-19 19:55:19.857 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:20 compute-0 nova_compute[186662]: 2026-02-19 19:55:20.556 186666 DEBUG nova.network.neutron [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Successfully updated port: 0fa7e740-cbda-4159-98cf-b9a37242e117 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Feb 19 19:55:20 compute-0 nova_compute[186662]: 2026-02-19 19:55:20.617 186666 DEBUG nova.compute.manager [req-ff8e4f9e-a887-4e97-b16f-2cba222cb39b req-676ffa4f-c8f4-4c91-9e50-5d9291e9b210 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Received event network-changed-0fa7e740-cbda-4159-98cf-b9a37242e117 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:55:20 compute-0 nova_compute[186662]: 2026-02-19 19:55:20.617 186666 DEBUG nova.compute.manager [req-ff8e4f9e-a887-4e97-b16f-2cba222cb39b req-676ffa4f-c8f4-4c91-9e50-5d9291e9b210 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Refreshing instance network info cache due to event network-changed-0fa7e740-cbda-4159-98cf-b9a37242e117. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Feb 19 19:55:20 compute-0 nova_compute[186662]: 2026-02-19 19:55:20.618 186666 DEBUG oslo_concurrency.lockutils [req-ff8e4f9e-a887-4e97-b16f-2cba222cb39b req-676ffa4f-c8f4-4c91-9e50-5d9291e9b210 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-15173393-c568-47d6-8cec-a82e80673bb2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:55:20 compute-0 nova_compute[186662]: 2026-02-19 19:55:20.618 186666 DEBUG oslo_concurrency.lockutils [req-ff8e4f9e-a887-4e97-b16f-2cba222cb39b req-676ffa4f-c8f4-4c91-9e50-5d9291e9b210 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-15173393-c568-47d6-8cec-a82e80673bb2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:55:20 compute-0 nova_compute[186662]: 2026-02-19 19:55:20.618 186666 DEBUG nova.network.neutron [req-ff8e4f9e-a887-4e97-b16f-2cba222cb39b req-676ffa4f-c8f4-4c91-9e50-5d9291e9b210 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Refreshing network info cache for port 0fa7e740-cbda-4159-98cf-b9a37242e117 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Feb 19 19:55:21 compute-0 nova_compute[186662]: 2026-02-19 19:55:21.062 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "refresh_cache-15173393-c568-47d6-8cec-a82e80673bb2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:55:21 compute-0 nova_compute[186662]: 2026-02-19 19:55:21.123 186666 WARNING neutronclient.v2_0.client [req-ff8e4f9e-a887-4e97-b16f-2cba222cb39b req-676ffa4f-c8f4-4c91-9e50-5d9291e9b210 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:55:21 compute-0 nova_compute[186662]: 2026-02-19 19:55:21.391 186666 DEBUG nova.network.neutron [req-ff8e4f9e-a887-4e97-b16f-2cba222cb39b req-676ffa4f-c8f4-4c91-9e50-5d9291e9b210 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:55:21 compute-0 nova_compute[186662]: 2026-02-19 19:55:21.595 186666 DEBUG nova.network.neutron [req-ff8e4f9e-a887-4e97-b16f-2cba222cb39b req-676ffa4f-c8f4-4c91-9e50-5d9291e9b210 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:55:21 compute-0 nova_compute[186662]: 2026-02-19 19:55:21.817 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:22 compute-0 nova_compute[186662]: 2026-02-19 19:55:22.100 186666 DEBUG oslo_concurrency.lockutils [req-ff8e4f9e-a887-4e97-b16f-2cba222cb39b req-676ffa4f-c8f4-4c91-9e50-5d9291e9b210 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-15173393-c568-47d6-8cec-a82e80673bb2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:55:22 compute-0 nova_compute[186662]: 2026-02-19 19:55:22.101 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquired lock "refresh_cache-15173393-c568-47d6-8cec-a82e80673bb2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:55:22 compute-0 nova_compute[186662]: 2026-02-19 19:55:22.101 186666 DEBUG nova.network.neutron [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:55:22 compute-0 podman[218975]: 2026-02-19 19:55:22.290580926 +0000 UTC m=+0.066595036 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter)
Feb 19 19:55:23 compute-0 nova_compute[186662]: 2026-02-19 19:55:23.396 186666 DEBUG nova.network.neutron [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Feb 19 19:55:23 compute-0 nova_compute[186662]: 2026-02-19 19:55:23.587 186666 WARNING neutronclient.v2_0.client [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:55:24 compute-0 podman[218996]: 2026-02-19 19:55:24.307354734 +0000 UTC m=+0.077450879 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, container_name=ovn_controller, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.vendor=CentOS)
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.401 186666 DEBUG nova.network.neutron [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Updating instance_info_cache with network_info: [{"id": "0fa7e740-cbda-4159-98cf-b9a37242e117", "address": "fa:16:3e:32:86:c3", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fa7e740-cb", "ovs_interfaceid": "0fa7e740-cbda-4159-98cf-b9a37242e117", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.860 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.909 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Releasing lock "refresh_cache-15173393-c568-47d6-8cec-a82e80673bb2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.910 186666 DEBUG nova.compute.manager [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Instance network_info: |[{"id": "0fa7e740-cbda-4159-98cf-b9a37242e117", "address": "fa:16:3e:32:86:c3", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fa7e740-cb", "ovs_interfaceid": "0fa7e740-cbda-4159-98cf-b9a37242e117", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.912 186666 DEBUG nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Start _get_guest_xml network_info=[{"id": "0fa7e740-cbda-4159-98cf-b9a37242e117", "address": "fa:16:3e:32:86:c3", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fa7e740-cb", "ovs_interfaceid": "0fa7e740-cbda-4159-98cf-b9a37242e117", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'size': 0, 'image_id': 'b8007ea6-afa7-4c5a-abc0-d9d7338ce087'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.917 186666 WARNING nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.919 186666 DEBUG nova.virt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategy-server-1139573967', uuid='15173393-c568-47d6-8cec-a82e80673bb2'), owner=OwnerMeta(userid='6f190c8d209a43b19d4cba5936ab90e0', username='tempest-TestExecuteZoneMigrationStrategy-81034023-project-admin', projectid='dafcf6090521473590cdb432a889739e', projectname='tempest-TestExecuteZoneMigrationStrategy-81034023'), image=ImageMeta(id='b8007ea6-afa7-4c5a-abc0-d9d7338ce087', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "0fa7e740-cbda-4159-98cf-b9a37242e117", "address": "fa:16:3e:32:86:c3", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fa7e740-cb", "ovs_interfaceid": 
"0fa7e740-cbda-4159-98cf-b9a37242e117", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1771530924.9198632) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.926 186666 DEBUG nova.virt.libvirt.host [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.926 186666 DEBUG nova.virt.libvirt.host [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.929 186666 DEBUG nova.virt.libvirt.host [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.930 186666 DEBUG nova.virt.libvirt.host [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.932 186666 DEBUG nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.933 186666 DEBUG nova.virt.hardware [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-19T19:19:33Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3881472c-99fb-4fe5-ab4d-bf6223e45537',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-19T19:19:35Z,direct_url=<?>,disk_format='qcow2',id=b8007ea6-afa7-4c5a-abc0-d9d7338ce087,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='084bf37190834c4d9a8f0459d9d05ec7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-19T19:19:36Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.934 186666 DEBUG nova.virt.hardware [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.934 186666 DEBUG nova.virt.hardware [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.935 186666 DEBUG nova.virt.hardware [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.936 186666 DEBUG nova.virt.hardware [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.936 186666 DEBUG nova.virt.hardware [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.937 186666 DEBUG nova.virt.hardware [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.937 186666 DEBUG nova.virt.hardware [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.938 186666 DEBUG nova.virt.hardware [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.938 186666 DEBUG nova.virt.hardware [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.939 186666 DEBUG nova.virt.hardware [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.947 186666 DEBUG nova.virt.libvirt.vif [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:55:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1139573967',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1139573967',id=33,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dafcf6090521473590cdb432a889739e',ramdisk_id='',reservation_id='r-4ldo001t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-81034023',owner_user_name='tempest-TestExecu
teZoneMigrationStrategy-81034023-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:55:18Z,user_data=None,user_id='6f190c8d209a43b19d4cba5936ab90e0',uuid=15173393-c568-47d6-8cec-a82e80673bb2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0fa7e740-cbda-4159-98cf-b9a37242e117", "address": "fa:16:3e:32:86:c3", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fa7e740-cb", "ovs_interfaceid": "0fa7e740-cbda-4159-98cf-b9a37242e117", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.948 186666 DEBUG nova.network.os_vif_util [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Converting VIF {"id": "0fa7e740-cbda-4159-98cf-b9a37242e117", "address": "fa:16:3e:32:86:c3", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fa7e740-cb", "ovs_interfaceid": "0fa7e740-cbda-4159-98cf-b9a37242e117", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.950 186666 DEBUG nova.network.os_vif_util [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:86:c3,bridge_name='br-int',has_traffic_filtering=True,id=0fa7e740-cbda-4159-98cf-b9a37242e117,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fa7e740-cb') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:55:24 compute-0 nova_compute[186662]: 2026-02-19 19:55:24.951 186666 DEBUG nova.objects.instance [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lazy-loading 'pci_devices' on Instance uuid 15173393-c568-47d6-8cec-a82e80673bb2 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.460 186666 DEBUG nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] End _get_guest_xml xml=<domain type="kvm">
Feb 19 19:55:25 compute-0 nova_compute[186662]:   <uuid>15173393-c568-47d6-8cec-a82e80673bb2</uuid>
Feb 19 19:55:25 compute-0 nova_compute[186662]:   <name>instance-00000021</name>
Feb 19 19:55:25 compute-0 nova_compute[186662]:   <memory>131072</memory>
Feb 19 19:55:25 compute-0 nova_compute[186662]:   <vcpu>1</vcpu>
Feb 19 19:55:25 compute-0 nova_compute[186662]:   <metadata>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-1139573967</nova:name>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <nova:creationTime>2026-02-19 19:55:24</nova:creationTime>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <nova:flavor name="m1.nano" id="3881472c-99fb-4fe5-ab4d-bf6223e45537">
Feb 19 19:55:25 compute-0 nova_compute[186662]:         <nova:memory>128</nova:memory>
Feb 19 19:55:25 compute-0 nova_compute[186662]:         <nova:disk>1</nova:disk>
Feb 19 19:55:25 compute-0 nova_compute[186662]:         <nova:swap>0</nova:swap>
Feb 19 19:55:25 compute-0 nova_compute[186662]:         <nova:ephemeral>0</nova:ephemeral>
Feb 19 19:55:25 compute-0 nova_compute[186662]:         <nova:vcpus>1</nova:vcpus>
Feb 19 19:55:25 compute-0 nova_compute[186662]:         <nova:extraSpecs>
Feb 19 19:55:25 compute-0 nova_compute[186662]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Feb 19 19:55:25 compute-0 nova_compute[186662]:         </nova:extraSpecs>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       </nova:flavor>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <nova:image uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087">
Feb 19 19:55:25 compute-0 nova_compute[186662]:         <nova:containerFormat>bare</nova:containerFormat>
Feb 19 19:55:25 compute-0 nova_compute[186662]:         <nova:diskFormat>qcow2</nova:diskFormat>
Feb 19 19:55:25 compute-0 nova_compute[186662]:         <nova:minDisk>1</nova:minDisk>
Feb 19 19:55:25 compute-0 nova_compute[186662]:         <nova:minRam>0</nova:minRam>
Feb 19 19:55:25 compute-0 nova_compute[186662]:         <nova:properties>
Feb 19 19:55:25 compute-0 nova_compute[186662]:           <nova:property name="hw_rng_model">virtio</nova:property>
Feb 19 19:55:25 compute-0 nova_compute[186662]:         </nova:properties>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       </nova:image>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <nova:owner>
Feb 19 19:55:25 compute-0 nova_compute[186662]:         <nova:user uuid="6f190c8d209a43b19d4cba5936ab90e0">tempest-TestExecuteZoneMigrationStrategy-81034023-project-admin</nova:user>
Feb 19 19:55:25 compute-0 nova_compute[186662]:         <nova:project uuid="dafcf6090521473590cdb432a889739e">tempest-TestExecuteZoneMigrationStrategy-81034023</nova:project>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       </nova:owner>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <nova:root type="image" uuid="b8007ea6-afa7-4c5a-abc0-d9d7338ce087"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <nova:ports>
Feb 19 19:55:25 compute-0 nova_compute[186662]:         <nova:port uuid="0fa7e740-cbda-4159-98cf-b9a37242e117">
Feb 19 19:55:25 compute-0 nova_compute[186662]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:         </nova:port>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       </nova:ports>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     </nova:instance>
Feb 19 19:55:25 compute-0 nova_compute[186662]:   </metadata>
Feb 19 19:55:25 compute-0 nova_compute[186662]:   <sysinfo type="smbios">
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <system>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <entry name="manufacturer">RDO</entry>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <entry name="product">OpenStack Compute</entry>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <entry name="serial">15173393-c568-47d6-8cec-a82e80673bb2</entry>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <entry name="uuid">15173393-c568-47d6-8cec-a82e80673bb2</entry>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <entry name="family">Virtual Machine</entry>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     </system>
Feb 19 19:55:25 compute-0 nova_compute[186662]:   </sysinfo>
Feb 19 19:55:25 compute-0 nova_compute[186662]:   <os>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <boot dev="hd"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <smbios mode="sysinfo"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:   </os>
Feb 19 19:55:25 compute-0 nova_compute[186662]:   <features>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <acpi/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <apic/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <vmcoreinfo/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:   </features>
Feb 19 19:55:25 compute-0 nova_compute[186662]:   <clock offset="utc">
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <timer name="pit" tickpolicy="delay"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <timer name="hpet" present="no"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:   </clock>
Feb 19 19:55:25 compute-0 nova_compute[186662]:   <cpu mode="custom" match="exact">
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <model>Nehalem</model>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <topology sockets="1" cores="1" threads="1"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:   </cpu>
Feb 19 19:55:25 compute-0 nova_compute[186662]:   <devices>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <disk type="file" device="disk">
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2/disk"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <target dev="vda" bus="virtio"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <disk type="file" device="cdrom">
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <driver name="qemu" type="raw" cache="none"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <source file="/var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2/disk.config"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <target dev="sda" bus="sata"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     </disk>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <interface type="ethernet">
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <mac address="fa:16:3e:32:86:c3"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <driver name="vhost" rx_queue_size="512"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <mtu size="1442"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <target dev="tap0fa7e740-cb"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     </interface>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <serial type="pty">
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <log file="/var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2/console.log" append="off"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     </serial>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <video>
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <model type="virtio"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     </video>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <input type="tablet" bus="usb"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <rng model="virtio">
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <backend model="random">/dev/urandom</backend>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     </rng>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="pci" model="pcie-root-port"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <controller type="usb" index="0"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Feb 19 19:55:25 compute-0 nova_compute[186662]:       <stats period="10"/>
Feb 19 19:55:25 compute-0 nova_compute[186662]:     </memballoon>
Feb 19 19:55:25 compute-0 nova_compute[186662]:   </devices>
Feb 19 19:55:25 compute-0 nova_compute[186662]: </domain>
Feb 19 19:55:25 compute-0 nova_compute[186662]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.461 186666 DEBUG nova.compute.manager [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Preparing to wait for external event network-vif-plugged-0fa7e740-cbda-4159-98cf-b9a37242e117 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.461 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "15173393-c568-47d6-8cec-a82e80673bb2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.461 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "15173393-c568-47d6-8cec-a82e80673bb2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.462 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "15173393-c568-47d6-8cec-a82e80673bb2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.462 186666 DEBUG nova.virt.libvirt.vif [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-02-19T19:55:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1139573967',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1139573967',id=33,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dafcf6090521473590cdb432a889739e',ramdisk_id='',reservation_id='r-4ldo001t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-81034023',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-81034023-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:55:18Z,user_data=None,user_id='6f190c8d209a43b19d4cba5936ab90e0',uuid=15173393-c568-47d6-8cec-a82e80673bb2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0fa7e740-cbda-4159-98cf-b9a37242e117", "address": "fa:16:3e:32:86:c3", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fa7e740-cb", "ovs_interfaceid": "0fa7e740-cbda-4159-98cf-b9a37242e117", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.463 186666 DEBUG nova.network.os_vif_util [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Converting VIF {"id": "0fa7e740-cbda-4159-98cf-b9a37242e117", "address": "fa:16:3e:32:86:c3", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fa7e740-cb", "ovs_interfaceid": "0fa7e740-cbda-4159-98cf-b9a37242e117", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.464 186666 DEBUG nova.network.os_vif_util [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:86:c3,bridge_name='br-int',has_traffic_filtering=True,id=0fa7e740-cbda-4159-98cf-b9a37242e117,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fa7e740-cb') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.464 186666 DEBUG os_vif [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:86:c3,bridge_name='br-int',has_traffic_filtering=True,id=0fa7e740-cbda-4159-98cf-b9a37242e117,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fa7e740-cb') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.464 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.465 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.465 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.465 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.466 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'ab3e58fa-4db8-5260-9578-97922a8ca965', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.467 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.469 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.471 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.472 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0fa7e740-cb, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.472 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap0fa7e740-cb, col_values=(('qos', UUID('a4db38e5-aecb-44eb-be97-8fbc5cd8dbec')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.472 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap0fa7e740-cb, col_values=(('external_ids', {'iface-id': '0fa7e740-cbda-4159-98cf-b9a37242e117', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:32:86:c3', 'vm-uuid': '15173393-c568-47d6-8cec-a82e80673bb2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.473 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:25 compute-0 NetworkManager[56519]: <info>  [1771530925.4745] manager: (tap0fa7e740-cb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.476 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.480 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:25 compute-0 nova_compute[186662]: 2026-02-19 19:55:25.482 186666 INFO os_vif [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:86:c3,bridge_name='br-int',has_traffic_filtering=True,id=0fa7e740-cbda-4159-98cf-b9a37242e117,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fa7e740-cb')
Feb 19 19:55:26 compute-0 nova_compute[186662]: 2026-02-19 19:55:26.819 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:27 compute-0 nova_compute[186662]: 2026-02-19 19:55:27.033 186666 DEBUG nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:55:27 compute-0 nova_compute[186662]: 2026-02-19 19:55:27.033 186666 DEBUG nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Feb 19 19:55:27 compute-0 nova_compute[186662]: 2026-02-19 19:55:27.034 186666 DEBUG nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] No VIF found with MAC fa:16:3e:32:86:c3, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Feb 19 19:55:27 compute-0 nova_compute[186662]: 2026-02-19 19:55:27.034 186666 INFO nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Using config drive
Feb 19 19:55:27 compute-0 podman[219025]: 2026-02-19 19:55:27.262303238 +0000 UTC m=+0.042114723 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 19 19:55:27 compute-0 nova_compute[186662]: 2026-02-19 19:55:27.543 186666 WARNING neutronclient.v2_0.client [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:55:27 compute-0 nova_compute[186662]: 2026-02-19 19:55:27.683 186666 INFO nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Creating config drive at /var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2/disk.config
Feb 19 19:55:27 compute-0 nova_compute[186662]: 2026-02-19 19:55:27.687 186666 DEBUG oslo_concurrency.processutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpui4iybn6 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:55:27 compute-0 nova_compute[186662]: 2026-02-19 19:55:27.805 186666 DEBUG oslo_concurrency.processutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpui4iybn6" returned: 0 in 0.119s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:55:27 compute-0 kernel: tap0fa7e740-cb: entered promiscuous mode
Feb 19 19:55:27 compute-0 NetworkManager[56519]: <info>  [1771530927.8730] manager: (tap0fa7e740-cb): new Tun device (/org/freedesktop/NetworkManager/Devices/95)
Feb 19 19:55:27 compute-0 nova_compute[186662]: 2026-02-19 19:55:27.873 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:27 compute-0 ovn_controller[96653]: 2026-02-19T19:55:27Z|00251|binding|INFO|Claiming lport 0fa7e740-cbda-4159-98cf-b9a37242e117 for this chassis.
Feb 19 19:55:27 compute-0 ovn_controller[96653]: 2026-02-19T19:55:27Z|00252|binding|INFO|0fa7e740-cbda-4159-98cf-b9a37242e117: Claiming fa:16:3e:32:86:c3 10.100.0.9
Feb 19 19:55:27 compute-0 nova_compute[186662]: 2026-02-19 19:55:27.878 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:27.881 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:86:c3 10.100.0.9'], port_security=['fa:16:3e:32:86:c3 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '15173393-c568-47d6-8cec-a82e80673bb2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dafcf6090521473590cdb432a889739e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8f4a8406-fbc3-4742-80fc-e9181ef138d4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ddef6f8-604e-4b41-9f83-1182f76298b6, chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=0fa7e740-cbda-4159-98cf-b9a37242e117) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:55:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:27.882 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 0fa7e740-cbda-4159-98cf-b9a37242e117 in datapath 2cb93231-0e3e-4efd-8b0c-4366500dbd16 bound to our chassis
Feb 19 19:55:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:27.884 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2cb93231-0e3e-4efd-8b0c-4366500dbd16
Feb 19 19:55:27 compute-0 nova_compute[186662]: 2026-02-19 19:55:27.884 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:27 compute-0 ovn_controller[96653]: 2026-02-19T19:55:27Z|00253|binding|INFO|Setting lport 0fa7e740-cbda-4159-98cf-b9a37242e117 ovn-installed in OVS
Feb 19 19:55:27 compute-0 ovn_controller[96653]: 2026-02-19T19:55:27Z|00254|binding|INFO|Setting lport 0fa7e740-cbda-4159-98cf-b9a37242e117 up in Southbound
Feb 19 19:55:27 compute-0 nova_compute[186662]: 2026-02-19 19:55:27.889 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:27 compute-0 nova_compute[186662]: 2026-02-19 19:55:27.891 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:27.891 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[c047ebf7-7b51-41c1-ad18-7e595c4e1564]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:55:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:27.893 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2cb93231-01 in ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Feb 19 19:55:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:27.896 207540 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2cb93231-00 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Feb 19 19:55:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:27.896 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[e6cb7de7-ee4d-422f-ada2-dd1e5ef86657]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:55:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:27.897 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[ad1362dc-cd7c-4eed-8d5a-ac08607583f3]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:55:27 compute-0 systemd-udevd[219069]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:55:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:27.910 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[4c227f1b-dba5-47ef-8f9f-3f058ec3d7dc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:55:27 compute-0 systemd-machined[156014]: New machine qemu-24-instance-00000021.
Feb 19 19:55:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:27.917 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[96639105-18e0-4276-958d-d149d4ba4cb4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:55:27 compute-0 NetworkManager[56519]: <info>  [1771530927.9214] device (tap0fa7e740-cb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:55:27 compute-0 NetworkManager[56519]: <info>  [1771530927.9220] device (tap0fa7e740-cb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:55:27 compute-0 systemd[1]: Started Virtual Machine qemu-24-instance-00000021.
Feb 19 19:55:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:27.943 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[6f0e2a0b-1564-4d0e-8d54-427cd4864a5e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:55:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:27.947 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e9dadc-73b1-482b-9fe7-0d61e05ba9f6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:55:27 compute-0 NetworkManager[56519]: <info>  [1771530927.9498] manager: (tap2cb93231-00): new Veth device (/org/freedesktop/NetworkManager/Devices/96)
Feb 19 19:55:27 compute-0 systemd-udevd[219072]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:55:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:27.976 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[56b80037-06ae-4b99-9290-c7412b3e982f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:55:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:27.978 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[be7e91d8-5ebd-4b71-b296-9b293f92b562]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:55:28 compute-0 NetworkManager[56519]: <info>  [1771530928.0004] device (tap2cb93231-00): carrier: link connected
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:28.008 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[a47cb531-2005-4244-8991-94d925caa5e8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:28.025 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[83c390cf-c5d2-4762-9bab-636c76f58455]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2cb93231-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:50:71'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519756, 'reachable_time': 18572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219100, 'error': None, 'target': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:28.042 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[2da9b6d7-f442-4b28-87c1-c8b836949e91]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feff:5071'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 519756, 'tstamp': 519756}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219101, 'error': None, 'target': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:28.061 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[462772ce-b222-4695-b996-c3a2f81b1771]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2cb93231-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:50:71'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519756, 'reachable_time': 18572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219102, 'error': None, 'target': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:55:28 compute-0 nova_compute[186662]: 2026-02-19 19:55:28.062 186666 DEBUG nova.compute.manager [req-9edd6c2b-0d0b-4b20-9711-a61e600d36cd req-5831f468-86e3-466e-9ef9-9fc1886dac60 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Received event network-vif-plugged-0fa7e740-cbda-4159-98cf-b9a37242e117 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:55:28 compute-0 nova_compute[186662]: 2026-02-19 19:55:28.062 186666 DEBUG oslo_concurrency.lockutils [req-9edd6c2b-0d0b-4b20-9711-a61e600d36cd req-5831f468-86e3-466e-9ef9-9fc1886dac60 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "15173393-c568-47d6-8cec-a82e80673bb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:55:28 compute-0 nova_compute[186662]: 2026-02-19 19:55:28.062 186666 DEBUG oslo_concurrency.lockutils [req-9edd6c2b-0d0b-4b20-9711-a61e600d36cd req-5831f468-86e3-466e-9ef9-9fc1886dac60 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "15173393-c568-47d6-8cec-a82e80673bb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:55:28 compute-0 nova_compute[186662]: 2026-02-19 19:55:28.062 186666 DEBUG oslo_concurrency.lockutils [req-9edd6c2b-0d0b-4b20-9711-a61e600d36cd req-5831f468-86e3-466e-9ef9-9fc1886dac60 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "15173393-c568-47d6-8cec-a82e80673bb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:55:28 compute-0 nova_compute[186662]: 2026-02-19 19:55:28.062 186666 DEBUG nova.compute.manager [req-9edd6c2b-0d0b-4b20-9711-a61e600d36cd req-5831f468-86e3-466e-9ef9-9fc1886dac60 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Processing event network-vif-plugged-0fa7e740-cbda-4159-98cf-b9a37242e117 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:28.094 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[eeb5e880-30eb-457c-9788-9c4a348c8fe9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:28.146 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[5ba4c846-3ba2-4b4e-aa45-cbf4dc71e069]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:28.149 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2cb93231-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:28.149 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:28.149 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2cb93231-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:55:28 compute-0 nova_compute[186662]: 2026-02-19 19:55:28.152 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:28 compute-0 kernel: tap2cb93231-00: entered promiscuous mode
Feb 19 19:55:28 compute-0 NetworkManager[56519]: <info>  [1771530928.1531] manager: (tap2cb93231-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:28.154 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2cb93231-00, col_values=(('external_ids', {'iface-id': '4d5db50f-b83e-4965-bddf-4fd62a569e63'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:55:28 compute-0 ovn_controller[96653]: 2026-02-19T19:55:28Z|00255|binding|INFO|Releasing lport 4d5db50f-b83e-4965-bddf-4fd62a569e63 from this chassis (sb_readonly=0)
Feb 19 19:55:28 compute-0 nova_compute[186662]: 2026-02-19 19:55:28.155 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:28.157 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[7fe78b08-7812-42cc-aa71-3d13373a43b4]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:28.157 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:28.157 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:28.157 105986 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 2cb93231-0e3e-4efd-8b0c-4366500dbd16 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:28.157 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:28.158 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[1781facd-6e64-44b4-b276-706acb2a7eff]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:28.158 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:28.158 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[d926410e-a3f0-4217-81de-72b691f4a18e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:55:28 compute-0 nova_compute[186662]: 2026-02-19 19:55:28.159 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:28.159 105986 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: global
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     log         /dev/log local0 debug
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     log-tag     haproxy-metadata-proxy-2cb93231-0e3e-4efd-8b0c-4366500dbd16
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     user        root
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     group       root
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     maxconn     1024
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     pidfile     /var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     daemon
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: defaults
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     log global
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     mode http
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     option httplog
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     option dontlognull
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     option http-server-close
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     option forwardfor
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     retries                 3
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     timeout http-request    30s
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     timeout connect         30s
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     timeout client          32s
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     timeout server          32s
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     timeout http-keep-alive 30s
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: listen listener
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     bind 169.254.169.254:80
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     server metadata /var/lib/neutron/metadata_proxy
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:     http-request add-header X-OVN-Network-ID 2cb93231-0e3e-4efd-8b0c-4366500dbd16
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Feb 19 19:55:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:28.160 105986 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'env', 'PROCESS_TAG=haproxy-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2cb93231-0e3e-4efd-8b0c-4366500dbd16.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Feb 19 19:55:28 compute-0 nova_compute[186662]: 2026-02-19 19:55:28.432 186666 DEBUG nova.compute.manager [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Feb 19 19:55:28 compute-0 nova_compute[186662]: 2026-02-19 19:55:28.437 186666 DEBUG nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Feb 19 19:55:28 compute-0 nova_compute[186662]: 2026-02-19 19:55:28.440 186666 INFO nova.virt.libvirt.driver [-] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Instance spawned successfully.
Feb 19 19:55:28 compute-0 nova_compute[186662]: 2026-02-19 19:55:28.440 186666 DEBUG nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Feb 19 19:55:28 compute-0 podman[219141]: 2026-02-19 19:55:28.550466993 +0000 UTC m=+0.051878499 container create 312db5c79a732c985afeb526f188ddcd6f4d96fb27fe50e49f1b9134c61cdb5d (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260216)
Feb 19 19:55:28 compute-0 systemd[1]: Started libpod-conmon-312db5c79a732c985afeb526f188ddcd6f4d96fb27fe50e49f1b9134c61cdb5d.scope.
Feb 19 19:55:28 compute-0 systemd[1]: Started libcrun container.
Feb 19 19:55:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b08b39043be9c560e4dc135cd208f5c8aec6f7ae4db1699aa058823ce629f67b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 19 19:55:28 compute-0 podman[219141]: 2026-02-19 19:55:28.524698507 +0000 UTC m=+0.026110043 image pull dfc4ee2c621ea595c16a7cd7a8233293e80df6b99346b1ce1a7ed22a35aace27 38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Feb 19 19:55:28 compute-0 podman[219141]: 2026-02-19 19:55:28.630927804 +0000 UTC m=+0.132339350 container init 312db5c79a732c985afeb526f188ddcd6f4d96fb27fe50e49f1b9134c61cdb5d (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0)
Feb 19 19:55:28 compute-0 podman[219141]: 2026-02-19 19:55:28.635454545 +0000 UTC m=+0.136866061 container start 312db5c79a732c985afeb526f188ddcd6f4d96fb27fe50e49f1b9134c61cdb5d (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 19 19:55:28 compute-0 neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16[219156]: [NOTICE]   (219160) : New worker (219162) forked
Feb 19 19:55:28 compute-0 neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16[219156]: [NOTICE]   (219160) : Loading success.
Feb 19 19:55:28 compute-0 nova_compute[186662]: 2026-02-19 19:55:28.952 186666 DEBUG nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:55:28 compute-0 nova_compute[186662]: 2026-02-19 19:55:28.952 186666 DEBUG nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:55:28 compute-0 nova_compute[186662]: 2026-02-19 19:55:28.953 186666 DEBUG nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:55:28 compute-0 nova_compute[186662]: 2026-02-19 19:55:28.953 186666 DEBUG nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:55:28 compute-0 nova_compute[186662]: 2026-02-19 19:55:28.954 186666 DEBUG nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:55:28 compute-0 nova_compute[186662]: 2026-02-19 19:55:28.955 186666 DEBUG nova.virt.libvirt.driver [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Feb 19 19:55:29 compute-0 nova_compute[186662]: 2026-02-19 19:55:29.464 186666 INFO nova.compute.manager [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Took 10.19 seconds to spawn the instance on the hypervisor.
Feb 19 19:55:29 compute-0 nova_compute[186662]: 2026-02-19 19:55:29.464 186666 DEBUG nova.compute.manager [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:55:29 compute-0 podman[196025]: time="2026-02-19T19:55:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:55:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:55:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17221 "" "Go-http-client/1.1"
Feb 19 19:55:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:55:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2659 "" "Go-http-client/1.1"
Feb 19 19:55:29 compute-0 nova_compute[186662]: 2026-02-19 19:55:29.998 186666 INFO nova.compute.manager [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Took 15.48 seconds to build instance.
Feb 19 19:55:30 compute-0 nova_compute[186662]: 2026-02-19 19:55:30.128 186666 DEBUG nova.compute.manager [req-6710d1a0-1543-4fc6-90dc-71ee8a9af080 req-a0f6133f-b804-4a90-a952-3e9073ca7d52 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Received event network-vif-plugged-0fa7e740-cbda-4159-98cf-b9a37242e117 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:55:30 compute-0 nova_compute[186662]: 2026-02-19 19:55:30.129 186666 DEBUG oslo_concurrency.lockutils [req-6710d1a0-1543-4fc6-90dc-71ee8a9af080 req-a0f6133f-b804-4a90-a952-3e9073ca7d52 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "15173393-c568-47d6-8cec-a82e80673bb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:55:30 compute-0 nova_compute[186662]: 2026-02-19 19:55:30.129 186666 DEBUG oslo_concurrency.lockutils [req-6710d1a0-1543-4fc6-90dc-71ee8a9af080 req-a0f6133f-b804-4a90-a952-3e9073ca7d52 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "15173393-c568-47d6-8cec-a82e80673bb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:55:30 compute-0 nova_compute[186662]: 2026-02-19 19:55:30.129 186666 DEBUG oslo_concurrency.lockutils [req-6710d1a0-1543-4fc6-90dc-71ee8a9af080 req-a0f6133f-b804-4a90-a952-3e9073ca7d52 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "15173393-c568-47d6-8cec-a82e80673bb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:55:30 compute-0 nova_compute[186662]: 2026-02-19 19:55:30.129 186666 DEBUG nova.compute.manager [req-6710d1a0-1543-4fc6-90dc-71ee8a9af080 req-a0f6133f-b804-4a90-a952-3e9073ca7d52 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] No waiting events found dispatching network-vif-plugged-0fa7e740-cbda-4159-98cf-b9a37242e117 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:55:30 compute-0 nova_compute[186662]: 2026-02-19 19:55:30.130 186666 WARNING nova.compute.manager [req-6710d1a0-1543-4fc6-90dc-71ee8a9af080 req-a0f6133f-b804-4a90-a952-3e9073ca7d52 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Received unexpected event network-vif-plugged-0fa7e740-cbda-4159-98cf-b9a37242e117 for instance with vm_state active and task_state None.
Feb 19 19:55:30 compute-0 nova_compute[186662]: 2026-02-19 19:55:30.474 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:30 compute-0 nova_compute[186662]: 2026-02-19 19:55:30.504 186666 DEBUG oslo_concurrency.lockutils [None req-c5a7ed88-812c-4eca-84c9-cb50eacf215e 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "15173393-c568-47d6-8cec-a82e80673bb2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.003s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:55:31 compute-0 openstack_network_exporter[198916]: ERROR   19:55:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:55:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:55:31 compute-0 openstack_network_exporter[198916]: ERROR   19:55:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:55:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:55:31 compute-0 nova_compute[186662]: 2026-02-19 19:55:31.820 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:32.168 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:55:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:32.168 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:55:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:32.168 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:55:32 compute-0 nova_compute[186662]: 2026-02-19 19:55:32.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:55:32 compute-0 nova_compute[186662]: 2026-02-19 19:55:32.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:55:35 compute-0 nova_compute[186662]: 2026-02-19 19:55:35.477 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:36 compute-0 nova_compute[186662]: 2026-02-19 19:55:36.080 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:55:36 compute-0 nova_compute[186662]: 2026-02-19 19:55:36.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:55:36 compute-0 nova_compute[186662]: 2026-02-19 19:55:36.821 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:36 compute-0 sshd-session[219172]: Invalid user nexus from 197.211.55.20 port 33984
Feb 19 19:55:37 compute-0 sshd-session[219172]: Received disconnect from 197.211.55.20 port 33984:11: Bye Bye [preauth]
Feb 19 19:55:37 compute-0 sshd-session[219172]: Disconnected from invalid user nexus 197.211.55.20 port 33984 [preauth]
Feb 19 19:55:38 compute-0 nova_compute[186662]: 2026-02-19 19:55:38.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:55:39 compute-0 nova_compute[186662]: 2026-02-19 19:55:39.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:55:40 compute-0 nova_compute[186662]: 2026-02-19 19:55:40.479 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:40 compute-0 ovn_controller[96653]: 2026-02-19T19:55:40Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:32:86:c3 10.100.0.9
Feb 19 19:55:40 compute-0 ovn_controller[96653]: 2026-02-19T19:55:40Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:32:86:c3 10.100.0.9
Feb 19 19:55:41 compute-0 nova_compute[186662]: 2026-02-19 19:55:41.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:55:41 compute-0 nova_compute[186662]: 2026-02-19 19:55:41.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:55:41 compute-0 nova_compute[186662]: 2026-02-19 19:55:41.824 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:42 compute-0 nova_compute[186662]: 2026-02-19 19:55:42.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:55:42 compute-0 nova_compute[186662]: 2026-02-19 19:55:42.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:55:42 compute-0 nova_compute[186662]: 2026-02-19 19:55:42.617 186666 DEBUG nova.virt.libvirt.driver [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Creating tmpfile /var/lib/nova/instances/tmpoqb5icxe to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Feb 19 19:55:42 compute-0 nova_compute[186662]: 2026-02-19 19:55:42.618 186666 WARNING neutronclient.v2_0.client [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:55:42 compute-0 nova_compute[186662]: 2026-02-19 19:55:42.714 186666 DEBUG nova.compute.manager [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpoqb5icxe',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Feb 19 19:55:43 compute-0 nova_compute[186662]: 2026-02-19 19:55:43.087 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:55:43 compute-0 nova_compute[186662]: 2026-02-19 19:55:43.088 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:55:43 compute-0 nova_compute[186662]: 2026-02-19 19:55:43.088 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:55:43 compute-0 nova_compute[186662]: 2026-02-19 19:55:43.088 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:55:43 compute-0 sshd-session[219190]: Invalid user ubuntu from 45.169.200.254 port 57238
Feb 19 19:55:43 compute-0 sshd-session[219190]: Received disconnect from 45.169.200.254 port 57238:11: Bye Bye [preauth]
Feb 19 19:55:43 compute-0 sshd-session[219190]: Disconnected from invalid user ubuntu 45.169.200.254 port 57238 [preauth]
Feb 19 19:55:44 compute-0 nova_compute[186662]: 2026-02-19 19:55:44.124 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:55:44 compute-0 nova_compute[186662]: 2026-02-19 19:55:44.206 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:55:44 compute-0 nova_compute[186662]: 2026-02-19 19:55:44.207 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:55:44 compute-0 nova_compute[186662]: 2026-02-19 19:55:44.256 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:55:44 compute-0 nova_compute[186662]: 2026-02-19 19:55:44.392 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:55:44 compute-0 nova_compute[186662]: 2026-02-19 19:55:44.393 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:55:44 compute-0 nova_compute[186662]: 2026-02-19 19:55:44.408 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:55:44 compute-0 nova_compute[186662]: 2026-02-19 19:55:44.409 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5598MB free_disk=72.93861389160156GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:55:44 compute-0 nova_compute[186662]: 2026-02-19 19:55:44.409 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:55:44 compute-0 nova_compute[186662]: 2026-02-19 19:55:44.410 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:55:44 compute-0 nova_compute[186662]: 2026-02-19 19:55:44.746 186666 WARNING neutronclient.v2_0.client [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:55:45 compute-0 nova_compute[186662]: 2026-02-19 19:55:45.442 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance 15173393-c568-47d6-8cec-a82e80673bb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Feb 19 19:55:45 compute-0 nova_compute[186662]: 2026-02-19 19:55:45.481 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:45 compute-0 nova_compute[186662]: 2026-02-19 19:55:45.949 186666 WARNING nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Instance 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Feb 19 19:55:45 compute-0 nova_compute[186662]: 2026-02-19 19:55:45.950 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:55:45 compute-0 nova_compute[186662]: 2026-02-19 19:55:45.950 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:55:44 up  1:26,  0 user,  load average: 0.25, 0.15, 0.19\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_dafcf6090521473590cdb432a889739e': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:55:45 compute-0 nova_compute[186662]: 2026-02-19 19:55:45.990 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:55:46 compute-0 nova_compute[186662]: 2026-02-19 19:55:46.496 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:55:46 compute-0 nova_compute[186662]: 2026-02-19 19:55:46.825 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:47 compute-0 nova_compute[186662]: 2026-02-19 19:55:47.008 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:55:47 compute-0 nova_compute[186662]: 2026-02-19 19:55:47.009 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.599s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:55:47 compute-0 sshd-session[219199]: Received disconnect from 106.51.64.128 port 49276:11: Bye Bye [preauth]
Feb 19 19:55:47 compute-0 sshd-session[219199]: Disconnected from authenticating user root 106.51.64.128 port 49276 [preauth]
Feb 19 19:55:48 compute-0 podman[219201]: 2026-02-19 19:55:48.29236479 +0000 UTC m=+0.059482134 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Feb 19 19:55:49 compute-0 nova_compute[186662]: 2026-02-19 19:55:49.010 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:55:49 compute-0 nova_compute[186662]: 2026-02-19 19:55:49.287 186666 DEBUG nova.compute.manager [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpoqb5icxe',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3d326f22-9bd5-4bf2-a7dc-5f425116bfc6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Feb 19 19:55:50 compute-0 nova_compute[186662]: 2026-02-19 19:55:50.304 186666 DEBUG oslo_concurrency.lockutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-3d326f22-9bd5-4bf2-a7dc-5f425116bfc6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:55:50 compute-0 nova_compute[186662]: 2026-02-19 19:55:50.304 186666 DEBUG oslo_concurrency.lockutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-3d326f22-9bd5-4bf2-a7dc-5f425116bfc6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:55:50 compute-0 nova_compute[186662]: 2026-02-19 19:55:50.304 186666 DEBUG nova.network.neutron [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:55:50 compute-0 nova_compute[186662]: 2026-02-19 19:55:50.484 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:50 compute-0 nova_compute[186662]: 2026-02-19 19:55:50.810 186666 WARNING neutronclient.v2_0.client [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:55:51 compute-0 nova_compute[186662]: 2026-02-19 19:55:51.349 186666 WARNING neutronclient.v2_0.client [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:55:51 compute-0 nova_compute[186662]: 2026-02-19 19:55:51.548 186666 DEBUG nova.network.neutron [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Updating instance_info_cache with network_info: [{"id": "929759bd-ecdb-42bf-81b8-d0482da18002", "address": "fa:16:3e:61:d9:90", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap929759bd-ec", "ovs_interfaceid": "929759bd-ecdb-42bf-81b8-d0482da18002", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:55:51 compute-0 nova_compute[186662]: 2026-02-19 19:55:51.826 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.053 186666 DEBUG oslo_concurrency.lockutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-3d326f22-9bd5-4bf2-a7dc-5f425116bfc6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.069 186666 DEBUG nova.virt.libvirt.driver [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpoqb5icxe',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3d326f22-9bd5-4bf2-a7dc-5f425116bfc6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.070 186666 DEBUG nova.virt.libvirt.driver [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Creating instance directory: /var/lib/nova/instances/3d326f22-9bd5-4bf2-a7dc-5f425116bfc6 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.070 186666 DEBUG nova.virt.libvirt.driver [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Creating disk.info with the contents: {'/var/lib/nova/instances/3d326f22-9bd5-4bf2-a7dc-5f425116bfc6/disk': 'qcow2', '/var/lib/nova/instances/3d326f22-9bd5-4bf2-a7dc-5f425116bfc6/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.071 186666 DEBUG nova.virt.libvirt.driver [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.072 186666 DEBUG nova.objects.instance [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.578 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.582 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.585 186666 DEBUG oslo_concurrency.processutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.629 186666 DEBUG oslo_concurrency.processutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.630 186666 DEBUG oslo_concurrency.lockutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.631 186666 DEBUG oslo_concurrency.lockutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.632 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.635 186666 DEBUG oslo_utils.imageutils.format_inspector [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.636 186666 DEBUG oslo_concurrency.processutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.681 186666 DEBUG oslo_concurrency.processutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.682 186666 DEBUG oslo_concurrency.processutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/3d326f22-9bd5-4bf2-a7dc-5f425116bfc6/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.709 186666 DEBUG oslo_concurrency.processutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c,backing_fmt=raw /var/lib/nova/instances/3d326f22-9bd5-4bf2-a7dc-5f425116bfc6/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.710 186666 DEBUG oslo_concurrency.lockutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "dd6239b6d9476b53b0f7aed518607f70b6981c2c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.079s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.711 186666 DEBUG oslo_concurrency.processutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.761 186666 DEBUG oslo_concurrency.processutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd6239b6d9476b53b0f7aed518607f70b6981c2c --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.762 186666 DEBUG nova.virt.disk.api [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Checking if we can resize image /var/lib/nova/instances/3d326f22-9bd5-4bf2-a7dc-5f425116bfc6/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.762 186666 DEBUG oslo_concurrency.processutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d326f22-9bd5-4bf2-a7dc-5f425116bfc6/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.830 186666 DEBUG oslo_concurrency.processutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d326f22-9bd5-4bf2-a7dc-5f425116bfc6/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.832 186666 DEBUG nova.virt.disk.api [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Cannot resize image /var/lib/nova/instances/3d326f22-9bd5-4bf2-a7dc-5f425116bfc6/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Feb 19 19:55:52 compute-0 nova_compute[186662]: 2026-02-19 19:55:52.833 186666 DEBUG nova.objects.instance [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lazy-loading 'migration_context' on Instance uuid 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:55:53 compute-0 podman[219237]: 2026-02-19 19:55:53.274794481 +0000 UTC m=+0.052513905 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, version=9.7, maintainer=Red Hat, Inc., distribution-scope=public)
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.341 186666 DEBUG nova.objects.base [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Object Instance<3d326f22-9bd5-4bf2-a7dc-5f425116bfc6> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.341 186666 DEBUG oslo_concurrency.processutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/3d326f22-9bd5-4bf2-a7dc-5f425116bfc6/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.358 186666 DEBUG oslo_concurrency.processutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/3d326f22-9bd5-4bf2-a7dc-5f425116bfc6/disk.config 497664" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.358 186666 DEBUG nova.virt.libvirt.driver [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.360 186666 DEBUG nova.virt.libvirt.vif [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-02-19T19:54:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1961702911',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1961702911',id=32,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:55:08Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='dafcf6090521473590cdb432a889739e',ramdisk_id='',reservation_id='r-xsg8xnxp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-81034023',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-81034023-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-19T19:55:08Z,user_data=None,user_id='6f190c8d209a43b19d4cba5936ab90e0',uuid=3d326f22-9bd5-4bf2-a7dc-5f425116bfc6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "929759bd-ecdb-42bf-81b8-d0482da18002", "address": "fa:16:3e:61:d9:90", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap929759bd-ec", "ovs_interfaceid": "929759bd-ecdb-42bf-81b8-d0482da18002", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.360 186666 DEBUG nova.network.os_vif_util [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converting VIF {"id": "929759bd-ecdb-42bf-81b8-d0482da18002", "address": "fa:16:3e:61:d9:90", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap929759bd-ec", "ovs_interfaceid": "929759bd-ecdb-42bf-81b8-d0482da18002", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.361 186666 DEBUG nova.network.os_vif_util [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:d9:90,bridge_name='br-int',has_traffic_filtering=True,id=929759bd-ecdb-42bf-81b8-d0482da18002,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap929759bd-ec') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.361 186666 DEBUG os_vif [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:d9:90,bridge_name='br-int',has_traffic_filtering=True,id=929759bd-ecdb-42bf-81b8-d0482da18002,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap929759bd-ec') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.361 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.362 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.362 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.363 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.363 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '42a8c9d6-4353-55a3-bb68-767be229cd30', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.364 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.365 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.369 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.369 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap929759bd-ec, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.369 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap929759bd-ec, col_values=(('qos', UUID('decab925-06e0-4f24-acf5-4b303c40f6ea')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.370 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap929759bd-ec, col_values=(('external_ids', {'iface-id': '929759bd-ecdb-42bf-81b8-d0482da18002', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:61:d9:90', 'vm-uuid': '3d326f22-9bd5-4bf2-a7dc-5f425116bfc6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.371 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:53 compute-0 NetworkManager[56519]: <info>  [1771530953.3720] manager: (tap929759bd-ec): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.373 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.378 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.378 186666 INFO os_vif [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:d9:90,bridge_name='br-int',has_traffic_filtering=True,id=929759bd-ecdb-42bf-81b8-d0482da18002,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap929759bd-ec')
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.379 186666 DEBUG nova.virt.libvirt.driver [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.379 186666 DEBUG nova.compute.manager [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpoqb5icxe',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3d326f22-9bd5-4bf2-a7dc-5f425116bfc6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.380 186666 WARNING neutronclient.v2_0.client [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:55:53 compute-0 nova_compute[186662]: 2026-02-19 19:55:53.449 186666 WARNING neutronclient.v2_0.client [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:55:55 compute-0 podman[219263]: 2026-02-19 19:55:55.283241237 +0000 UTC m=+0.062827365 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:55:56 compute-0 nova_compute[186662]: 2026-02-19 19:55:56.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:55:56 compute-0 nova_compute[186662]: 2026-02-19 19:55:56.576 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Feb 19 19:55:56 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:56.730 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:55:56 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:56.731 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:55:56 compute-0 nova_compute[186662]: 2026-02-19 19:55:56.886 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:57 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:55:57.732 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:55:57 compute-0 ovn_controller[96653]: 2026-02-19T19:55:57Z|00256|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 19 19:55:58 compute-0 podman[219291]: 2026-02-19 19:55:58.277064593 +0000 UTC m=+0.052650298 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 19 19:55:58 compute-0 nova_compute[186662]: 2026-02-19 19:55:58.372 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:55:59 compute-0 podman[196025]: time="2026-02-19T19:55:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:55:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:55:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17221 "" "Go-http-client/1.1"
Feb 19 19:55:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:55:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2664 "" "Go-http-client/1.1"
Feb 19 19:56:00 compute-0 nova_compute[186662]: 2026-02-19 19:56:00.081 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:56:00 compute-0 nova_compute[186662]: 2026-02-19 19:56:00.081 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Feb 19 19:56:00 compute-0 nova_compute[186662]: 2026-02-19 19:56:00.587 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Feb 19 19:56:00 compute-0 nova_compute[186662]: 2026-02-19 19:56:00.883 186666 DEBUG nova.network.neutron [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Port 929759bd-ecdb-42bf-81b8-d0482da18002 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Feb 19 19:56:00 compute-0 nova_compute[186662]: 2026-02-19 19:56:00.896 186666 DEBUG nova.compute.manager [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpoqb5icxe',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3d326f22-9bd5-4bf2-a7dc-5f425116bfc6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Feb 19 19:56:01 compute-0 openstack_network_exporter[198916]: ERROR   19:56:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:56:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:56:01 compute-0 openstack_network_exporter[198916]: ERROR   19:56:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:56:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:56:01 compute-0 nova_compute[186662]: 2026-02-19 19:56:01.887 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:03 compute-0 nova_compute[186662]: 2026-02-19 19:56:03.374 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:03 compute-0 kernel: tap929759bd-ec: entered promiscuous mode
Feb 19 19:56:03 compute-0 NetworkManager[56519]: <info>  [1771530963.4647] manager: (tap929759bd-ec): new Tun device (/org/freedesktop/NetworkManager/Devices/99)
Feb 19 19:56:03 compute-0 ovn_controller[96653]: 2026-02-19T19:56:03Z|00257|binding|INFO|Claiming lport 929759bd-ecdb-42bf-81b8-d0482da18002 for this additional chassis.
Feb 19 19:56:03 compute-0 ovn_controller[96653]: 2026-02-19T19:56:03Z|00258|binding|INFO|929759bd-ecdb-42bf-81b8-d0482da18002: Claiming fa:16:3e:61:d9:90 10.100.0.8
Feb 19 19:56:03 compute-0 nova_compute[186662]: 2026-02-19 19:56:03.467 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:03.472 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:d9:90 10.100.0.8'], port_security=['fa:16:3e:61:d9:90 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3d326f22-9bd5-4bf2-a7dc-5f425116bfc6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dafcf6090521473590cdb432a889739e', 'neutron:revision_number': '10', 'neutron:security_group_ids': '8f4a8406-fbc3-4742-80fc-e9181ef138d4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ddef6f8-604e-4b41-9f83-1182f76298b6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=929759bd-ecdb-42bf-81b8-d0482da18002) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:56:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:03.473 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 929759bd-ecdb-42bf-81b8-d0482da18002 in datapath 2cb93231-0e3e-4efd-8b0c-4366500dbd16 unbound from our chassis
Feb 19 19:56:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:03.474 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2cb93231-0e3e-4efd-8b0c-4366500dbd16
Feb 19 19:56:03 compute-0 ovn_controller[96653]: 2026-02-19T19:56:03Z|00259|binding|INFO|Setting lport 929759bd-ecdb-42bf-81b8-d0482da18002 ovn-installed in OVS
Feb 19 19:56:03 compute-0 nova_compute[186662]: 2026-02-19 19:56:03.474 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:03 compute-0 nova_compute[186662]: 2026-02-19 19:56:03.476 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:03.486 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[42395ee7-0b99-41b5-b69c-1b30da71aac8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:03 compute-0 systemd-udevd[219331]: Network interface NamePolicy= disabled on kernel command line.
Feb 19 19:56:03 compute-0 systemd-machined[156014]: New machine qemu-25-instance-00000020.
Feb 19 19:56:03 compute-0 NetworkManager[56519]: <info>  [1771530963.5006] device (tap929759bd-ec): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 19 19:56:03 compute-0 NetworkManager[56519]: <info>  [1771530963.5012] device (tap929759bd-ec): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 19 19:56:03 compute-0 systemd[1]: Started Virtual Machine qemu-25-instance-00000020.
Feb 19 19:56:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:03.509 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[cf6f6308-31a3-45dd-964d-bddcf474a11d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:03.512 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[cf3c3184-51a5-40b9-b116-aae14018e7af]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:03.532 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[e6e2d8d9-bb51-4780-91d3-6f10a6ddfe5e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:03.543 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[dfcb00aa-55ec-4e6b-b77f-8d351efc4a0f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2cb93231-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:50:71'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519756, 'reachable_time': 18572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219343, 'error': None, 'target': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:03.555 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a637edd6-8c92-4498-a50f-82cf9a288556]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2cb93231-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 519767, 'tstamp': 519767}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219345, 'error': None, 'target': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2cb93231-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 519770, 'tstamp': 519770}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219345, 'error': None, 'target': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:03.557 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2cb93231-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:56:03 compute-0 nova_compute[186662]: 2026-02-19 19:56:03.559 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:03.559 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2cb93231-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:56:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:03.559 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:56:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:03.560 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2cb93231-00, col_values=(('external_ids', {'iface-id': '4d5db50f-b83e-4965-bddf-4fd62a569e63'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:56:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:03.560 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:56:03 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:03.561 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[bbfdac29-6c38-4593-83a1-0316ce0a2eb8]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-2cb93231-0e3e-4efd-8b0c-4366500dbd16\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 2cb93231-0e3e-4efd-8b0c-4366500dbd16\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:05 compute-0 ovn_controller[96653]: 2026-02-19T19:56:05Z|00260|binding|INFO|Claiming lport 929759bd-ecdb-42bf-81b8-d0482da18002 for this chassis.
Feb 19 19:56:05 compute-0 ovn_controller[96653]: 2026-02-19T19:56:05Z|00261|binding|INFO|929759bd-ecdb-42bf-81b8-d0482da18002: Claiming fa:16:3e:61:d9:90 10.100.0.8
Feb 19 19:56:05 compute-0 ovn_controller[96653]: 2026-02-19T19:56:05Z|00262|binding|INFO|Setting lport 929759bd-ecdb-42bf-81b8-d0482da18002 up in Southbound
Feb 19 19:56:06 compute-0 nova_compute[186662]: 2026-02-19 19:56:06.852 186666 INFO nova.compute.manager [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Post operation of migration started
Feb 19 19:56:06 compute-0 nova_compute[186662]: 2026-02-19 19:56:06.853 186666 WARNING neutronclient.v2_0.client [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:56:06 compute-0 nova_compute[186662]: 2026-02-19 19:56:06.890 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:06 compute-0 nova_compute[186662]: 2026-02-19 19:56:06.951 186666 WARNING neutronclient.v2_0.client [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:56:06 compute-0 nova_compute[186662]: 2026-02-19 19:56:06.951 186666 WARNING neutronclient.v2_0.client [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:56:07 compute-0 nova_compute[186662]: 2026-02-19 19:56:07.749 186666 DEBUG oslo_concurrency.lockutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "refresh_cache-3d326f22-9bd5-4bf2-a7dc-5f425116bfc6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Feb 19 19:56:07 compute-0 nova_compute[186662]: 2026-02-19 19:56:07.750 186666 DEBUG oslo_concurrency.lockutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquired lock "refresh_cache-3d326f22-9bd5-4bf2-a7dc-5f425116bfc6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Feb 19 19:56:07 compute-0 nova_compute[186662]: 2026-02-19 19:56:07.750 186666 DEBUG nova.network.neutron [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Feb 19 19:56:08 compute-0 nova_compute[186662]: 2026-02-19 19:56:08.255 186666 WARNING neutronclient.v2_0.client [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:56:08 compute-0 nova_compute[186662]: 2026-02-19 19:56:08.377 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:09 compute-0 nova_compute[186662]: 2026-02-19 19:56:09.797 186666 WARNING neutronclient.v2_0.client [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:56:09 compute-0 nova_compute[186662]: 2026-02-19 19:56:09.986 186666 DEBUG nova.network.neutron [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Updating instance_info_cache with network_info: [{"id": "929759bd-ecdb-42bf-81b8-d0482da18002", "address": "fa:16:3e:61:d9:90", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap929759bd-ec", "ovs_interfaceid": "929759bd-ecdb-42bf-81b8-d0482da18002", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:56:10 compute-0 nova_compute[186662]: 2026-02-19 19:56:10.493 186666 DEBUG oslo_concurrency.lockutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Releasing lock "refresh_cache-3d326f22-9bd5-4bf2-a7dc-5f425116bfc6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Feb 19 19:56:11 compute-0 nova_compute[186662]: 2026-02-19 19:56:11.048 186666 DEBUG oslo_concurrency.lockutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:56:11 compute-0 nova_compute[186662]: 2026-02-19 19:56:11.049 186666 DEBUG oslo_concurrency.lockutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:56:11 compute-0 nova_compute[186662]: 2026-02-19 19:56:11.049 186666 DEBUG oslo_concurrency.lockutils [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:56:11 compute-0 nova_compute[186662]: 2026-02-19 19:56:11.054 186666 INFO nova.virt.libvirt.driver [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 19 19:56:11 compute-0 virtqemud[186157]: Domain id=25 name='instance-00000020' uuid=3d326f22-9bd5-4bf2-a7dc-5f425116bfc6 is tainted: custom-monitor
Feb 19 19:56:11 compute-0 nova_compute[186662]: 2026-02-19 19:56:11.891 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:12 compute-0 nova_compute[186662]: 2026-02-19 19:56:12.062 186666 INFO nova.virt.libvirt.driver [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 19 19:56:13 compute-0 nova_compute[186662]: 2026-02-19 19:56:13.067 186666 INFO nova.virt.libvirt.driver [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 19 19:56:13 compute-0 nova_compute[186662]: 2026-02-19 19:56:13.071 186666 DEBUG nova.compute.manager [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Feb 19 19:56:13 compute-0 nova_compute[186662]: 2026-02-19 19:56:13.380 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:13 compute-0 nova_compute[186662]: 2026-02-19 19:56:13.579 186666 DEBUG nova.objects.instance [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Feb 19 19:56:14 compute-0 nova_compute[186662]: 2026-02-19 19:56:14.597 186666 WARNING neutronclient.v2_0.client [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:56:14 compute-0 nova_compute[186662]: 2026-02-19 19:56:14.751 186666 WARNING neutronclient.v2_0.client [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:56:14 compute-0 nova_compute[186662]: 2026-02-19 19:56:14.752 186666 WARNING neutronclient.v2_0.client [None req-e834d66e-167c-4625-8c26-1bde15fd7722 9d4592bf29644d399d63bc5c5c5d3756 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:56:16 compute-0 nova_compute[186662]: 2026-02-19 19:56:16.896 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:18 compute-0 nova_compute[186662]: 2026-02-19 19:56:18.386 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:19 compute-0 podman[219364]: 2026-02-19 19:56:19.271949534 +0000 UTC m=+0.046501228 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS)
Feb 19 19:56:21 compute-0 nova_compute[186662]: 2026-02-19 19:56:21.900 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:23 compute-0 nova_compute[186662]: 2026-02-19 19:56:23.389 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:24 compute-0 podman[219383]: 2026-02-19 19:56:24.307338599 +0000 UTC m=+0.084360427 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., release=1770267347, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, container_name=openstack_network_exporter)
Feb 19 19:56:26 compute-0 podman[219405]: 2026-02-19 19:56:26.353277665 +0000 UTC m=+0.132375172 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:56:26 compute-0 nova_compute[186662]: 2026-02-19 19:56:26.484 186666 DEBUG oslo_concurrency.lockutils [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "15173393-c568-47d6-8cec-a82e80673bb2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:56:26 compute-0 nova_compute[186662]: 2026-02-19 19:56:26.484 186666 DEBUG oslo_concurrency.lockutils [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "15173393-c568-47d6-8cec-a82e80673bb2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:56:26 compute-0 nova_compute[186662]: 2026-02-19 19:56:26.484 186666 DEBUG oslo_concurrency.lockutils [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "15173393-c568-47d6-8cec-a82e80673bb2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:56:26 compute-0 nova_compute[186662]: 2026-02-19 19:56:26.484 186666 DEBUG oslo_concurrency.lockutils [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "15173393-c568-47d6-8cec-a82e80673bb2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:56:26 compute-0 nova_compute[186662]: 2026-02-19 19:56:26.485 186666 DEBUG oslo_concurrency.lockutils [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "15173393-c568-47d6-8cec-a82e80673bb2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:56:26 compute-0 nova_compute[186662]: 2026-02-19 19:56:26.497 186666 INFO nova.compute.manager [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Terminating instance
Feb 19 19:56:26 compute-0 nova_compute[186662]: 2026-02-19 19:56:26.903 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.013 186666 DEBUG nova.compute.manager [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:56:27 compute-0 kernel: tap0fa7e740-cb (unregistering): left promiscuous mode
Feb 19 19:56:27 compute-0 NetworkManager[56519]: <info>  [1771530987.0379] device (tap0fa7e740-cb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:56:27 compute-0 ovn_controller[96653]: 2026-02-19T19:56:27Z|00263|binding|INFO|Releasing lport 0fa7e740-cbda-4159-98cf-b9a37242e117 from this chassis (sb_readonly=0)
Feb 19 19:56:27 compute-0 ovn_controller[96653]: 2026-02-19T19:56:27Z|00264|binding|INFO|Setting lport 0fa7e740-cbda-4159-98cf-b9a37242e117 down in Southbound
Feb 19 19:56:27 compute-0 ovn_controller[96653]: 2026-02-19T19:56:27Z|00265|binding|INFO|Removing iface tap0fa7e740-cb ovn-installed in OVS
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.043 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.046 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:27.052 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:86:c3 10.100.0.9'], port_security=['fa:16:3e:32:86:c3 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '15173393-c568-47d6-8cec-a82e80673bb2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dafcf6090521473590cdb432a889739e', 'neutron:revision_number': '5', 'neutron:security_group_ids': '8f4a8406-fbc3-4742-80fc-e9181ef138d4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ddef6f8-604e-4b41-9f83-1182f76298b6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=0fa7e740-cbda-4159-98cf-b9a37242e117) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:56:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:27.053 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 0fa7e740-cbda-4159-98cf-b9a37242e117 in datapath 2cb93231-0e3e-4efd-8b0c-4366500dbd16 unbound from our chassis
Feb 19 19:56:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:27.055 105986 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2cb93231-0e3e-4efd-8b0c-4366500dbd16
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.056 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:27.067 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[fdf7d0c2-e5fc-44e6-a419-30c2b18b4a26]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:27.088 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[c05f4b55-2c94-44db-b6a5-f45bd1cd16d2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:27.090 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[ac65318c-163f-42ea-aee7-58aaa12d0aeb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:27 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000021.scope: Deactivated successfully.
Feb 19 19:56:27 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000021.scope: Consumed 14.011s CPU time.
Feb 19 19:56:27 compute-0 systemd-machined[156014]: Machine qemu-24-instance-00000021 terminated.
Feb 19 19:56:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:27.111 208593 DEBUG oslo.privsep.daemon [-] privsep: reply[356c90dc-4772-4e12-9fce-3101b4220e00]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:27.124 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[93891720-5203-40f3-802e-7e1d0841ef91]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2cb93231-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:50:71'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519756, 'reachable_time': 18572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219447, 'error': None, 'target': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:27.135 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[7772bc61-34f2-447d-a650-2e025af91e85]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2cb93231-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 519767, 'tstamp': 519767}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219448, 'error': None, 'target': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2cb93231-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 519770, 'tstamp': 519770}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219448, 'error': None, 'target': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:27.136 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2cb93231-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.138 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.141 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:27.141 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2cb93231-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:56:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:27.142 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:56:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:27.142 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2cb93231-00, col_values=(('external_ids', {'iface-id': '4d5db50f-b83e-4965-bddf-4fd62a569e63'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:56:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:27.142 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 19 19:56:27 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:27.143 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[69860c48-d354-40ea-91ac-ca2d05c0f949]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-2cb93231-0e3e-4efd-8b0c-4366500dbd16\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 2cb93231-0e3e-4efd-8b0c-4366500dbd16\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.181 186666 DEBUG nova.compute.manager [req-9eea81cc-c758-4c5f-924f-2635f32ce025 req-1245172e-52da-4144-b89f-751530f60185 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Received event network-vif-unplugged-0fa7e740-cbda-4159-98cf-b9a37242e117 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.181 186666 DEBUG oslo_concurrency.lockutils [req-9eea81cc-c758-4c5f-924f-2635f32ce025 req-1245172e-52da-4144-b89f-751530f60185 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "15173393-c568-47d6-8cec-a82e80673bb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.182 186666 DEBUG oslo_concurrency.lockutils [req-9eea81cc-c758-4c5f-924f-2635f32ce025 req-1245172e-52da-4144-b89f-751530f60185 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "15173393-c568-47d6-8cec-a82e80673bb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.182 186666 DEBUG oslo_concurrency.lockutils [req-9eea81cc-c758-4c5f-924f-2635f32ce025 req-1245172e-52da-4144-b89f-751530f60185 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "15173393-c568-47d6-8cec-a82e80673bb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.182 186666 DEBUG nova.compute.manager [req-9eea81cc-c758-4c5f-924f-2635f32ce025 req-1245172e-52da-4144-b89f-751530f60185 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] No waiting events found dispatching network-vif-unplugged-0fa7e740-cbda-4159-98cf-b9a37242e117 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.182 186666 DEBUG nova.compute.manager [req-9eea81cc-c758-4c5f-924f-2635f32ce025 req-1245172e-52da-4144-b89f-751530f60185 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Received event network-vif-unplugged-0fa7e740-cbda-4159-98cf-b9a37242e117 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.261 186666 INFO nova.virt.libvirt.driver [-] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Instance destroyed successfully.
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.262 186666 DEBUG nova.objects.instance [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lazy-loading 'resources' on Instance uuid 15173393-c568-47d6-8cec-a82e80673bb2 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.770 186666 DEBUG nova.virt.libvirt.vif [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-02-19T19:55:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1139573967',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1139573967',id=33,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:55:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dafcf6090521473590cdb432a889739e',ramdisk_id='',reservation_id='r-4ldo001t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image
_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-81034023',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-81034023-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:55:29Z,user_data=None,user_id='6f190c8d209a43b19d4cba5936ab90e0',uuid=15173393-c568-47d6-8cec-a82e80673bb2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0fa7e740-cbda-4159-98cf-b9a37242e117", "address": "fa:16:3e:32:86:c3", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fa7e740-cb", "ovs_interfaceid": "0fa7e740-cbda-4159-98cf-b9a37242e117", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.771 186666 DEBUG nova.network.os_vif_util [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Converting VIF {"id": "0fa7e740-cbda-4159-98cf-b9a37242e117", "address": "fa:16:3e:32:86:c3", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fa7e740-cb", "ovs_interfaceid": "0fa7e740-cbda-4159-98cf-b9a37242e117", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.771 186666 DEBUG nova.network.os_vif_util [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:86:c3,bridge_name='br-int',has_traffic_filtering=True,id=0fa7e740-cbda-4159-98cf-b9a37242e117,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fa7e740-cb') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.772 186666 DEBUG os_vif [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:86:c3,bridge_name='br-int',has_traffic_filtering=True,id=0fa7e740-cbda-4159-98cf-b9a37242e117,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fa7e740-cb') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.773 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.774 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0fa7e740-cb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.775 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.776 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.777 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.777 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=a4db38e5-aecb-44eb-be97-8fbc5cd8dbec) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.778 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.779 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.781 186666 INFO os_vif [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:86:c3,bridge_name='br-int',has_traffic_filtering=True,id=0fa7e740-cbda-4159-98cf-b9a37242e117,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fa7e740-cb')
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.781 186666 INFO nova.virt.libvirt.driver [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Deleting instance files /var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2_del
Feb 19 19:56:27 compute-0 nova_compute[186662]: 2026-02-19 19:56:27.782 186666 INFO nova.virt.libvirt.driver [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Deletion of /var/lib/nova/instances/15173393-c568-47d6-8cec-a82e80673bb2_del complete
Feb 19 19:56:28 compute-0 nova_compute[186662]: 2026-02-19 19:56:28.294 186666 INFO nova.compute.manager [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Took 1.28 seconds to destroy the instance on the hypervisor.
Feb 19 19:56:28 compute-0 nova_compute[186662]: 2026-02-19 19:56:28.295 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:56:28 compute-0 nova_compute[186662]: 2026-02-19 19:56:28.295 186666 DEBUG nova.compute.manager [-] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:56:28 compute-0 nova_compute[186662]: 2026-02-19 19:56:28.295 186666 DEBUG nova.network.neutron [-] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:56:28 compute-0 nova_compute[186662]: 2026-02-19 19:56:28.296 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:56:28 compute-0 nova_compute[186662]: 2026-02-19 19:56:28.764 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:56:29 compute-0 nova_compute[186662]: 2026-02-19 19:56:29.239 186666 DEBUG nova.compute.manager [req-625ab19d-8852-4765-ba3e-964af3e380b7 req-feba6674-02a1-4694-a78d-926b8bf50930 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Received event network-vif-unplugged-0fa7e740-cbda-4159-98cf-b9a37242e117 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:56:29 compute-0 nova_compute[186662]: 2026-02-19 19:56:29.240 186666 DEBUG oslo_concurrency.lockutils [req-625ab19d-8852-4765-ba3e-964af3e380b7 req-feba6674-02a1-4694-a78d-926b8bf50930 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "15173393-c568-47d6-8cec-a82e80673bb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:56:29 compute-0 nova_compute[186662]: 2026-02-19 19:56:29.240 186666 DEBUG oslo_concurrency.lockutils [req-625ab19d-8852-4765-ba3e-964af3e380b7 req-feba6674-02a1-4694-a78d-926b8bf50930 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "15173393-c568-47d6-8cec-a82e80673bb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:56:29 compute-0 nova_compute[186662]: 2026-02-19 19:56:29.241 186666 DEBUG oslo_concurrency.lockutils [req-625ab19d-8852-4765-ba3e-964af3e380b7 req-feba6674-02a1-4694-a78d-926b8bf50930 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "15173393-c568-47d6-8cec-a82e80673bb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:56:29 compute-0 nova_compute[186662]: 2026-02-19 19:56:29.241 186666 DEBUG nova.compute.manager [req-625ab19d-8852-4765-ba3e-964af3e380b7 req-feba6674-02a1-4694-a78d-926b8bf50930 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] No waiting events found dispatching network-vif-unplugged-0fa7e740-cbda-4159-98cf-b9a37242e117 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:56:29 compute-0 nova_compute[186662]: 2026-02-19 19:56:29.241 186666 DEBUG nova.compute.manager [req-625ab19d-8852-4765-ba3e-964af3e380b7 req-feba6674-02a1-4694-a78d-926b8bf50930 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Received event network-vif-unplugged-0fa7e740-cbda-4159-98cf-b9a37242e117 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:56:29 compute-0 nova_compute[186662]: 2026-02-19 19:56:29.254 186666 DEBUG nova.compute.manager [req-7584ca15-578b-4238-b2b7-2c904443e68f req-d2404c26-baaf-4a06-b097-5bacdc2dc8a1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Received event network-vif-deleted-0fa7e740-cbda-4159-98cf-b9a37242e117 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:56:29 compute-0 nova_compute[186662]: 2026-02-19 19:56:29.254 186666 INFO nova.compute.manager [req-7584ca15-578b-4238-b2b7-2c904443e68f req-d2404c26-baaf-4a06-b097-5bacdc2dc8a1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Neutron deleted interface 0fa7e740-cbda-4159-98cf-b9a37242e117; detaching it from the instance and deleting it from the info cache
Feb 19 19:56:29 compute-0 nova_compute[186662]: 2026-02-19 19:56:29.255 186666 DEBUG nova.network.neutron [req-7584ca15-578b-4238-b2b7-2c904443e68f req-d2404c26-baaf-4a06-b097-5bacdc2dc8a1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:56:29 compute-0 podman[219467]: 2026-02-19 19:56:29.296517004 +0000 UTC m=+0.064652690 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 19 19:56:29 compute-0 nova_compute[186662]: 2026-02-19 19:56:29.698 186666 DEBUG nova.network.neutron [-] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:56:29 compute-0 podman[196025]: time="2026-02-19T19:56:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:56:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:56:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17221 "" "Go-http-client/1.1"
Feb 19 19:56:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:56:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2665 "" "Go-http-client/1.1"
Feb 19 19:56:29 compute-0 nova_compute[186662]: 2026-02-19 19:56:29.764 186666 DEBUG nova.compute.manager [req-7584ca15-578b-4238-b2b7-2c904443e68f req-d2404c26-baaf-4a06-b097-5bacdc2dc8a1 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Detach interface failed, port_id=0fa7e740-cbda-4159-98cf-b9a37242e117, reason: Instance 15173393-c568-47d6-8cec-a82e80673bb2 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Feb 19 19:56:30 compute-0 nova_compute[186662]: 2026-02-19 19:56:30.205 186666 INFO nova.compute.manager [-] [instance: 15173393-c568-47d6-8cec-a82e80673bb2] Took 1.91 seconds to deallocate network for instance.
Feb 19 19:56:30 compute-0 nova_compute[186662]: 2026-02-19 19:56:30.724 186666 DEBUG oslo_concurrency.lockutils [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:56:30 compute-0 nova_compute[186662]: 2026-02-19 19:56:30.724 186666 DEBUG oslo_concurrency.lockutils [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:56:30 compute-0 nova_compute[186662]: 2026-02-19 19:56:30.815 186666 DEBUG nova.compute.provider_tree [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:56:31 compute-0 nova_compute[186662]: 2026-02-19 19:56:31.321 186666 DEBUG nova.scheduler.client.report [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:56:31 compute-0 openstack_network_exporter[198916]: ERROR   19:56:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:56:31 compute-0 openstack_network_exporter[198916]: ERROR   19:56:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:56:31 compute-0 nova_compute[186662]: 2026-02-19 19:56:31.839 186666 DEBUG oslo_concurrency.lockutils [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.115s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:56:31 compute-0 nova_compute[186662]: 2026-02-19 19:56:31.859 186666 INFO nova.scheduler.client.report [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Deleted allocations for instance 15173393-c568-47d6-8cec-a82e80673bb2
Feb 19 19:56:31 compute-0 nova_compute[186662]: 2026-02-19 19:56:31.905 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:32.170 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:56:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:32.170 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:56:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:32.170 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:56:32 compute-0 nova_compute[186662]: 2026-02-19 19:56:32.778 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:32 compute-0 nova_compute[186662]: 2026-02-19 19:56:32.884 186666 DEBUG oslo_concurrency.lockutils [None req-2fed8076-ba05-4eb6-8d88-7509d1752291 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "15173393-c568-47d6-8cec-a82e80673bb2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.400s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:56:33 compute-0 nova_compute[186662]: 2026-02-19 19:56:33.082 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:56:34 compute-0 nova_compute[186662]: 2026-02-19 19:56:34.489 186666 DEBUG oslo_concurrency.lockutils [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "3d326f22-9bd5-4bf2-a7dc-5f425116bfc6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:56:34 compute-0 nova_compute[186662]: 2026-02-19 19:56:34.489 186666 DEBUG oslo_concurrency.lockutils [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "3d326f22-9bd5-4bf2-a7dc-5f425116bfc6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:56:34 compute-0 nova_compute[186662]: 2026-02-19 19:56:34.490 186666 DEBUG oslo_concurrency.lockutils [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "3d326f22-9bd5-4bf2-a7dc-5f425116bfc6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:56:34 compute-0 nova_compute[186662]: 2026-02-19 19:56:34.490 186666 DEBUG oslo_concurrency.lockutils [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "3d326f22-9bd5-4bf2-a7dc-5f425116bfc6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:56:34 compute-0 nova_compute[186662]: 2026-02-19 19:56:34.490 186666 DEBUG oslo_concurrency.lockutils [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "3d326f22-9bd5-4bf2-a7dc-5f425116bfc6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:56:34 compute-0 nova_compute[186662]: 2026-02-19 19:56:34.515 186666 INFO nova.compute.manager [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Terminating instance
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.030 186666 DEBUG nova.compute.manager [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Feb 19 19:56:35 compute-0 kernel: tap929759bd-ec (unregistering): left promiscuous mode
Feb 19 19:56:35 compute-0 NetworkManager[56519]: <info>  [1771530995.0542] device (tap929759bd-ec): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.057 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:35 compute-0 ovn_controller[96653]: 2026-02-19T19:56:35Z|00266|binding|INFO|Releasing lport 929759bd-ecdb-42bf-81b8-d0482da18002 from this chassis (sb_readonly=0)
Feb 19 19:56:35 compute-0 ovn_controller[96653]: 2026-02-19T19:56:35Z|00267|binding|INFO|Setting lport 929759bd-ecdb-42bf-81b8-d0482da18002 down in Southbound
Feb 19 19:56:35 compute-0 ovn_controller[96653]: 2026-02-19T19:56:35Z|00268|binding|INFO|Removing iface tap929759bd-ec ovn-installed in OVS
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.060 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.063 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:35.067 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:d9:90 10.100.0.8'], port_security=['fa:16:3e:61:d9:90 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3d326f22-9bd5-4bf2-a7dc-5f425116bfc6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dafcf6090521473590cdb432a889739e', 'neutron:revision_number': '15', 'neutron:security_group_ids': '8f4a8406-fbc3-4742-80fc-e9181ef138d4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ddef6f8-604e-4b41-9f83-1182f76298b6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>], logical_port=929759bd-ecdb-42bf-81b8-d0482da18002) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f468dd22600>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:56:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:35.069 105986 INFO neutron.agent.ovn.metadata.agent [-] Port 929759bd-ecdb-42bf-81b8-d0482da18002 in datapath 2cb93231-0e3e-4efd-8b0c-4366500dbd16 unbound from our chassis
Feb 19 19:56:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:35.070 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2cb93231-0e3e-4efd-8b0c-4366500dbd16, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:56:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:35.071 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[72868d66-0433-48ca-8461-26fd306f581d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:35.072 105986 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16 namespace which is not needed anymore
Feb 19 19:56:35 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000020.scope: Deactivated successfully.
Feb 19 19:56:35 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000020.scope: Consumed 1.980s CPU time.
Feb 19 19:56:35 compute-0 systemd-machined[156014]: Machine qemu-25-instance-00000020 terminated.
Feb 19 19:56:35 compute-0 podman[219517]: 2026-02-19 19:56:35.172008568 +0000 UTC m=+0.026050463 container kill 312db5c79a732c985afeb526f188ddcd6f4d96fb27fe50e49f1b9134c61cdb5d (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:56:35 compute-0 neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16[219156]: [NOTICE]   (219160) : haproxy version is 3.0.5-8e879a5
Feb 19 19:56:35 compute-0 neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16[219156]: [NOTICE]   (219160) : path to executable is /usr/sbin/haproxy
Feb 19 19:56:35 compute-0 neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16[219156]: [WARNING]  (219160) : Exiting Master process...
Feb 19 19:56:35 compute-0 neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16[219156]: [ALERT]    (219160) : Current worker (219162) exited with code 143 (Terminated)
Feb 19 19:56:35 compute-0 neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16[219156]: [WARNING]  (219160) : All workers exited. Exiting... (0)
Feb 19 19:56:35 compute-0 systemd[1]: libpod-312db5c79a732c985afeb526f188ddcd6f4d96fb27fe50e49f1b9134c61cdb5d.scope: Deactivated successfully.
Feb 19 19:56:35 compute-0 podman[219532]: 2026-02-19 19:56:35.208846171 +0000 UTC m=+0.020773085 container died 312db5c79a732c985afeb526f188ddcd6f4d96fb27fe50e49f1b9134c61cdb5d (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.210 186666 DEBUG nova.compute.manager [req-dfb55ccf-3bb6-4e63-bd06-61e0ae59d371 req-0e2413b3-1e13-4f29-a1ad-7161b45b55a6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Received event network-vif-unplugged-929759bd-ecdb-42bf-81b8-d0482da18002 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.210 186666 DEBUG oslo_concurrency.lockutils [req-dfb55ccf-3bb6-4e63-bd06-61e0ae59d371 req-0e2413b3-1e13-4f29-a1ad-7161b45b55a6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "3d326f22-9bd5-4bf2-a7dc-5f425116bfc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.210 186666 DEBUG oslo_concurrency.lockutils [req-dfb55ccf-3bb6-4e63-bd06-61e0ae59d371 req-0e2413b3-1e13-4f29-a1ad-7161b45b55a6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "3d326f22-9bd5-4bf2-a7dc-5f425116bfc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.211 186666 DEBUG oslo_concurrency.lockutils [req-dfb55ccf-3bb6-4e63-bd06-61e0ae59d371 req-0e2413b3-1e13-4f29-a1ad-7161b45b55a6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "3d326f22-9bd5-4bf2-a7dc-5f425116bfc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.211 186666 DEBUG nova.compute.manager [req-dfb55ccf-3bb6-4e63-bd06-61e0ae59d371 req-0e2413b3-1e13-4f29-a1ad-7161b45b55a6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] No waiting events found dispatching network-vif-unplugged-929759bd-ecdb-42bf-81b8-d0482da18002 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:56:35 compute-0 rsyslogd[1019]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.211 186666 DEBUG nova.compute.manager [req-dfb55ccf-3bb6-4e63-bd06-61e0ae59d371 req-0e2413b3-1e13-4f29-a1ad-7161b45b55a6 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Received event network-vif-unplugged-929759bd-ecdb-42bf-81b8-d0482da18002 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:56:35 compute-0 rsyslogd[1019]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 19 19:56:35 compute-0 rsyslogd[1019]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 19 19:56:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-312db5c79a732c985afeb526f188ddcd6f4d96fb27fe50e49f1b9134c61cdb5d-userdata-shm.mount: Deactivated successfully.
Feb 19 19:56:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-b08b39043be9c560e4dc135cd208f5c8aec6f7ae4db1699aa058823ce629f67b-merged.mount: Deactivated successfully.
Feb 19 19:56:35 compute-0 podman[219532]: 2026-02-19 19:56:35.243858221 +0000 UTC m=+0.055785125 container cleanup 312db5c79a732c985afeb526f188ddcd6f4d96fb27fe50e49f1b9134c61cdb5d (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:56:35 compute-0 systemd[1]: libpod-conmon-312db5c79a732c985afeb526f188ddcd6f4d96fb27fe50e49f1b9134c61cdb5d.scope: Deactivated successfully.
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.252 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.256 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:35 compute-0 podman[219534]: 2026-02-19 19:56:35.259849979 +0000 UTC m=+0.067847557 container remove 312db5c79a732c985afeb526f188ddcd6f4d96fb27fe50e49f1b9134c61cdb5d (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 19 19:56:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:35.266 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a8ba924a-d8d8-45ac-a52e-d457f8a07c8b]: (4, ("Thu Feb 19 07:56:35 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16 (312db5c79a732c985afeb526f188ddcd6f4d96fb27fe50e49f1b9134c61cdb5d)\n312db5c79a732c985afeb526f188ddcd6f4d96fb27fe50e49f1b9134c61cdb5d\nThu Feb 19 07:56:35 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16 (312db5c79a732c985afeb526f188ddcd6f4d96fb27fe50e49f1b9134c61cdb5d)\n312db5c79a732c985afeb526f188ddcd6f4d96fb27fe50e49f1b9134c61cdb5d\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:35.268 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[bc4cf39e-f010-40b9-ac62-25f0ef029561]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:35.269 105986 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2cb93231-0e3e-4efd-8b0c-4366500dbd16.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Feb 19 19:56:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:35.270 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[a432bb22-640a-486b-a212-0b28f5ab1c86]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:35.270 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2cb93231-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.272 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:35 compute-0 kernel: tap2cb93231-00: left promiscuous mode
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.280 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:35.282 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[c07f78b5-02c2-4ff7-8bbe-420f78a3b9b7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.294 186666 INFO nova.virt.libvirt.driver [-] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Instance destroyed successfully.
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.295 186666 DEBUG nova.objects.instance [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lazy-loading 'resources' on Instance uuid 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Feb 19 19:56:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:35.295 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[776f4a64-8878-4352-8470-b39d8b52ab7d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:35.295 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[cd3fd3a7-a5f3-4172-9823-dde5e1d908e1]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:35.306 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[2fba3d84-d21d-413e-a016-4bdbf60f451f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519750, 'reachable_time': 34741, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219582, 'error': None, 'target': 'ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:35 compute-0 systemd[1]: run-netns-ovnmeta\x2d2cb93231\x2d0e3e\x2d4efd\x2d8b0c\x2d4366500dbd16.mount: Deactivated successfully.
Feb 19 19:56:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:35.308 106358 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2cb93231-0e3e-4efd-8b0c-4366500dbd16 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Feb 19 19:56:35 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:35.308 106358 DEBUG oslo.privsep.daemon [-] privsep: reply[c5aac082-7d2b-4ff5-9a62-dc88317a5951]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.801 186666 DEBUG nova.virt.libvirt.vif [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2026-02-19T19:54:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1961702911',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1961702911',id=32,image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-19T19:55:08Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dafcf6090521473590cdb432a889739e',ramdisk_id='',reservation_id='r-xsg8xnxp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='b8007ea6-afa7-4c5a-abc0-d9d7338ce087',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mo
del='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-81034023',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-81034023-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-19T19:56:14Z,user_data=None,user_id='6f190c8d209a43b19d4cba5936ab90e0',uuid=3d326f22-9bd5-4bf2-a7dc-5f425116bfc6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "929759bd-ecdb-42bf-81b8-d0482da18002", "address": "fa:16:3e:61:d9:90", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap929759bd-ec", "ovs_interfaceid": "929759bd-ecdb-42bf-81b8-d0482da18002", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.801 186666 DEBUG nova.network.os_vif_util [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Converting VIF {"id": "929759bd-ecdb-42bf-81b8-d0482da18002", "address": "fa:16:3e:61:d9:90", "network": {"id": "2cb93231-0e3e-4efd-8b0c-4366500dbd16", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-492310625-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4e2c7ed37304bc098ed5d11e1ab4d17", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap929759bd-ec", "ovs_interfaceid": "929759bd-ecdb-42bf-81b8-d0482da18002", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.801 186666 DEBUG nova.network.os_vif_util [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:61:d9:90,bridge_name='br-int',has_traffic_filtering=True,id=929759bd-ecdb-42bf-81b8-d0482da18002,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap929759bd-ec') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.802 186666 DEBUG os_vif [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:61:d9:90,bridge_name='br-int',has_traffic_filtering=True,id=929759bd-ecdb-42bf-81b8-d0482da18002,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap929759bd-ec') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.803 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.803 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap929759bd-ec, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.805 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.807 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.808 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.808 186666 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=decab925-06e0-4f24-acf5-4b303c40f6ea) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.809 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.812 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.813 186666 INFO os_vif [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:61:d9:90,bridge_name='br-int',has_traffic_filtering=True,id=929759bd-ecdb-42bf-81b8-d0482da18002,network=Network(2cb93231-0e3e-4efd-8b0c-4366500dbd16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap929759bd-ec')
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.814 186666 INFO nova.virt.libvirt.driver [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Deleting instance files /var/lib/nova/instances/3d326f22-9bd5-4bf2-a7dc-5f425116bfc6_del
Feb 19 19:56:35 compute-0 nova_compute[186662]: 2026-02-19 19:56:35.815 186666 INFO nova.virt.libvirt.driver [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Deletion of /var/lib/nova/instances/3d326f22-9bd5-4bf2-a7dc-5f425116bfc6_del complete
Feb 19 19:56:36 compute-0 nova_compute[186662]: 2026-02-19 19:56:36.329 186666 INFO nova.compute.manager [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Took 1.30 seconds to destroy the instance on the hypervisor.
Feb 19 19:56:36 compute-0 nova_compute[186662]: 2026-02-19 19:56:36.330 186666 DEBUG oslo.service.backend._eventlet.loopingcall [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Feb 19 19:56:36 compute-0 nova_compute[186662]: 2026-02-19 19:56:36.330 186666 DEBUG nova.compute.manager [-] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Feb 19 19:56:36 compute-0 nova_compute[186662]: 2026-02-19 19:56:36.330 186666 DEBUG nova.network.neutron [-] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Feb 19 19:56:36 compute-0 nova_compute[186662]: 2026-02-19 19:56:36.331 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:56:36 compute-0 nova_compute[186662]: 2026-02-19 19:56:36.746 186666 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Feb 19 19:56:36 compute-0 nova_compute[186662]: 2026-02-19 19:56:36.907 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:37 compute-0 nova_compute[186662]: 2026-02-19 19:56:37.261 186666 DEBUG nova.compute.manager [req-789ae73e-a61d-434c-b82d-2e42c3cdda80 req-a8d73ac4-4a07-4884-941b-bad17f6b6487 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Received event network-vif-unplugged-929759bd-ecdb-42bf-81b8-d0482da18002 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:56:37 compute-0 nova_compute[186662]: 2026-02-19 19:56:37.261 186666 DEBUG oslo_concurrency.lockutils [req-789ae73e-a61d-434c-b82d-2e42c3cdda80 req-a8d73ac4-4a07-4884-941b-bad17f6b6487 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Acquiring lock "3d326f22-9bd5-4bf2-a7dc-5f425116bfc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:56:37 compute-0 nova_compute[186662]: 2026-02-19 19:56:37.261 186666 DEBUG oslo_concurrency.lockutils [req-789ae73e-a61d-434c-b82d-2e42c3cdda80 req-a8d73ac4-4a07-4884-941b-bad17f6b6487 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "3d326f22-9bd5-4bf2-a7dc-5f425116bfc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:56:37 compute-0 nova_compute[186662]: 2026-02-19 19:56:37.261 186666 DEBUG oslo_concurrency.lockutils [req-789ae73e-a61d-434c-b82d-2e42c3cdda80 req-a8d73ac4-4a07-4884-941b-bad17f6b6487 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] Lock "3d326f22-9bd5-4bf2-a7dc-5f425116bfc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:56:37 compute-0 nova_compute[186662]: 2026-02-19 19:56:37.262 186666 DEBUG nova.compute.manager [req-789ae73e-a61d-434c-b82d-2e42c3cdda80 req-a8d73ac4-4a07-4884-941b-bad17f6b6487 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] No waiting events found dispatching network-vif-unplugged-929759bd-ecdb-42bf-81b8-d0482da18002 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Feb 19 19:56:37 compute-0 nova_compute[186662]: 2026-02-19 19:56:37.262 186666 DEBUG nova.compute.manager [req-789ae73e-a61d-434c-b82d-2e42c3cdda80 req-a8d73ac4-4a07-4884-941b-bad17f6b6487 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Received event network-vif-unplugged-929759bd-ecdb-42bf-81b8-d0482da18002 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Feb 19 19:56:37 compute-0 nova_compute[186662]: 2026-02-19 19:56:37.262 186666 DEBUG nova.compute.manager [req-789ae73e-a61d-434c-b82d-2e42c3cdda80 req-a8d73ac4-4a07-4884-941b-bad17f6b6487 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Received event network-vif-deleted-929759bd-ecdb-42bf-81b8-d0482da18002 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Feb 19 19:56:37 compute-0 nova_compute[186662]: 2026-02-19 19:56:37.262 186666 INFO nova.compute.manager [req-789ae73e-a61d-434c-b82d-2e42c3cdda80 req-a8d73ac4-4a07-4884-941b-bad17f6b6487 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Neutron deleted interface 929759bd-ecdb-42bf-81b8-d0482da18002; detaching it from the instance and deleting it from the info cache
Feb 19 19:56:37 compute-0 nova_compute[186662]: 2026-02-19 19:56:37.262 186666 DEBUG nova.network.neutron [req-789ae73e-a61d-434c-b82d-2e42c3cdda80 req-a8d73ac4-4a07-4884-941b-bad17f6b6487 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:56:37 compute-0 nova_compute[186662]: 2026-02-19 19:56:37.466 186666 DEBUG nova.network.neutron [-] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Feb 19 19:56:37 compute-0 nova_compute[186662]: 2026-02-19 19:56:37.771 186666 DEBUG nova.compute.manager [req-789ae73e-a61d-434c-b82d-2e42c3cdda80 req-a8d73ac4-4a07-4884-941b-bad17f6b6487 11a5acbfea3846a8b5cc436152d134ac 3b02f24dc91d423a9c421b37ff0cc6b9 - - default default] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Detach interface failed, port_id=929759bd-ecdb-42bf-81b8-d0482da18002, reason: Instance 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Feb 19 19:56:37 compute-0 nova_compute[186662]: 2026-02-19 19:56:37.972 186666 INFO nova.compute.manager [-] [instance: 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6] Took 1.64 seconds to deallocate network for instance.
Feb 19 19:56:38 compute-0 nova_compute[186662]: 2026-02-19 19:56:38.493 186666 DEBUG oslo_concurrency.lockutils [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:56:38 compute-0 nova_compute[186662]: 2026-02-19 19:56:38.494 186666 DEBUG oslo_concurrency.lockutils [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:56:38 compute-0 nova_compute[186662]: 2026-02-19 19:56:38.500 186666 DEBUG oslo_concurrency.lockutils [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:56:38 compute-0 nova_compute[186662]: 2026-02-19 19:56:38.528 186666 INFO nova.scheduler.client.report [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Deleted allocations for instance 3d326f22-9bd5-4bf2-a7dc-5f425116bfc6
Feb 19 19:56:38 compute-0 nova_compute[186662]: 2026-02-19 19:56:38.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:56:39 compute-0 nova_compute[186662]: 2026-02-19 19:56:39.553 186666 DEBUG oslo_concurrency.lockutils [None req-82277c57-076f-4c97-9070-f46ed5f34f62 6f190c8d209a43b19d4cba5936ab90e0 dafcf6090521473590cdb432a889739e - - default default] Lock "3d326f22-9bd5-4bf2-a7dc-5f425116bfc6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.064s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:56:40 compute-0 nova_compute[186662]: 2026-02-19 19:56:40.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:56:40 compute-0 nova_compute[186662]: 2026-02-19 19:56:40.810 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:41 compute-0 nova_compute[186662]: 2026-02-19 19:56:41.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:56:41 compute-0 nova_compute[186662]: 2026-02-19 19:56:41.576 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:56:41 compute-0 nova_compute[186662]: 2026-02-19 19:56:41.908 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:42 compute-0 nova_compute[186662]: 2026-02-19 19:56:42.410 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:43 compute-0 nova_compute[186662]: 2026-02-19 19:56:43.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:56:44 compute-0 sshd-session[219583]: Invalid user titu from 96.78.175.42 port 44364
Feb 19 19:56:44 compute-0 sshd-session[219583]: Received disconnect from 96.78.175.42 port 44364:11: Bye Bye [preauth]
Feb 19 19:56:44 compute-0 sshd-session[219583]: Disconnected from invalid user titu 96.78.175.42 port 44364 [preauth]
Feb 19 19:56:44 compute-0 nova_compute[186662]: 2026-02-19 19:56:44.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:56:44 compute-0 nova_compute[186662]: 2026-02-19 19:56:44.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:56:45 compute-0 nova_compute[186662]: 2026-02-19 19:56:45.088 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:56:45 compute-0 nova_compute[186662]: 2026-02-19 19:56:45.089 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:56:45 compute-0 nova_compute[186662]: 2026-02-19 19:56:45.089 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:56:45 compute-0 nova_compute[186662]: 2026-02-19 19:56:45.089 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:56:45 compute-0 nova_compute[186662]: 2026-02-19 19:56:45.215 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:56:45 compute-0 nova_compute[186662]: 2026-02-19 19:56:45.216 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:56:45 compute-0 nova_compute[186662]: 2026-02-19 19:56:45.229 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:56:45 compute-0 nova_compute[186662]: 2026-02-19 19:56:45.230 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5771MB free_disk=72.96757125854492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:56:45 compute-0 nova_compute[186662]: 2026-02-19 19:56:45.230 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:56:45 compute-0 nova_compute[186662]: 2026-02-19 19:56:45.230 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:56:45 compute-0 nova_compute[186662]: 2026-02-19 19:56:45.854 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:46 compute-0 nova_compute[186662]: 2026-02-19 19:56:46.323 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:56:46 compute-0 nova_compute[186662]: 2026-02-19 19:56:46.324 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:56:45 up  1:27,  0 user,  load average: 0.13, 0.13, 0.18\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:56:46 compute-0 nova_compute[186662]: 2026-02-19 19:56:46.381 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:56:46 compute-0 nova_compute[186662]: 2026-02-19 19:56:46.890 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:56:46 compute-0 nova_compute[186662]: 2026-02-19 19:56:46.936 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:47 compute-0 nova_compute[186662]: 2026-02-19 19:56:47.402 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:56:47 compute-0 nova_compute[186662]: 2026-02-19 19:56:47.403 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.173s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:56:49 compute-0 nova_compute[186662]: 2026-02-19 19:56:49.327 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:56:50 compute-0 podman[219587]: 2026-02-19 19:56:50.329256384 +0000 UTC m=+0.102923988 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 19 19:56:50 compute-0 nova_compute[186662]: 2026-02-19 19:56:50.858 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:51 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:51.116 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:34:25 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-2edf70e1-d524-464f-87dc-3edb859728aa', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2edf70e1-d524-464f-87dc-3edb859728aa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9a355e7bba924709a3806e0b5f969517', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=35cdf083-9d06-406c-a622-7b7a984c789b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=9df55fa3-8cd7-4e88-add9-1256d694a1c6) old=Port_Binding(mac=['fa:16:3e:3b:34:25'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-2edf70e1-d524-464f-87dc-3edb859728aa', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2edf70e1-d524-464f-87dc-3edb859728aa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9a355e7bba924709a3806e0b5f969517', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:56:51 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:51.117 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 9df55fa3-8cd7-4e88-add9-1256d694a1c6 in datapath 2edf70e1-d524-464f-87dc-3edb859728aa updated
Feb 19 19:56:51 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:51.118 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2edf70e1-d524-464f-87dc-3edb859728aa, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:56:51 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:51.119 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[64a34320-10ae-4a93-8c33-5c485a66deb2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:51 compute-0 nova_compute[186662]: 2026-02-19 19:56:51.939 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:55 compute-0 podman[219608]: 2026-02-19 19:56:55.265395593 +0000 UTC m=+0.046387196 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter)
Feb 19 19:56:55 compute-0 nova_compute[186662]: 2026-02-19 19:56:55.859 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:56 compute-0 nova_compute[186662]: 2026-02-19 19:56:56.941 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:56:57 compute-0 podman[219630]: 2026-02-19 19:56:57.311479971 +0000 UTC m=+0.091129691 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest)
Feb 19 19:56:57 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:57.640 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:45:e6 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-13e761e0-0929-4e9b-86c1-d2af968a62b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-13e761e0-0929-4e9b-86c1-d2af968a62b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f235690cc41a43288281fc2088132c14', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bff6754-0061-49b1-a583-28e6d3e28ef7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=db07317f-1a2f-41db-b5eb-8e2f3bc3a9bc) old=Port_Binding(mac=['fa:16:3e:eb:45:e6'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-13e761e0-0929-4e9b-86c1-d2af968a62b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-13e761e0-0929-4e9b-86c1-d2af968a62b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f235690cc41a43288281fc2088132c14', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:56:57 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:57.641 105986 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port db07317f-1a2f-41db-b5eb-8e2f3bc3a9bc in datapath 13e761e0-0929-4e9b-86c1-d2af968a62b0 updated
Feb 19 19:56:57 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:57.641 105986 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 13e761e0-0929-4e9b-86c1-d2af968a62b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Feb 19 19:56:57 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:56:57.642 207540 DEBUG oslo.privsep.daemon [-] privsep: reply[b1c33a88-522a-4918-b52c-83373efeb57a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Feb 19 19:56:59 compute-0 podman[196025]: time="2026-02-19T19:56:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:56:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:56:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:56:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:56:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2202 "" "Go-http-client/1.1"
Feb 19 19:57:00 compute-0 podman[219658]: 2026-02-19 19:57:00.30124624 +0000 UTC m=+0.072841828 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 19 19:57:00 compute-0 nova_compute[186662]: 2026-02-19 19:57:00.861 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:01 compute-0 openstack_network_exporter[198916]: ERROR   19:57:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:57:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:57:01 compute-0 openstack_network_exporter[198916]: ERROR   19:57:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:57:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:57:01 compute-0 nova_compute[186662]: 2026-02-19 19:57:01.943 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:05 compute-0 nova_compute[186662]: 2026-02-19 19:57:05.863 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:06 compute-0 nova_compute[186662]: 2026-02-19 19:57:06.946 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:10 compute-0 nova_compute[186662]: 2026-02-19 19:57:10.865 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:57:11.429 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:57:11 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:57:11.430 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:57:11 compute-0 nova_compute[186662]: 2026-02-19 19:57:11.448 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:11 compute-0 nova_compute[186662]: 2026-02-19 19:57:11.948 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:15 compute-0 ovn_controller[96653]: 2026-02-19T19:57:15Z|00269|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 19 19:57:15 compute-0 nova_compute[186662]: 2026-02-19 19:57:15.866 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:16 compute-0 nova_compute[186662]: 2026-02-19 19:57:16.949 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:18 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:57:18.431 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:57:20 compute-0 nova_compute[186662]: 2026-02-19 19:57:20.868 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:21 compute-0 podman[219684]: 2026-02-19 19:57:21.288517482 +0000 UTC m=+0.066532346 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 19 19:57:21 compute-0 nova_compute[186662]: 2026-02-19 19:57:21.950 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:25 compute-0 nova_compute[186662]: 2026-02-19 19:57:25.870 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:25 compute-0 podman[219703]: 2026-02-19 19:57:25.93729399 +0000 UTC m=+0.046051958 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, maintainer=Red Hat, Inc., release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 19 19:57:26 compute-0 nova_compute[186662]: 2026-02-19 19:57:26.953 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:28 compute-0 podman[219726]: 2026-02-19 19:57:28.341499005 +0000 UTC m=+0.118714170 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 19 19:57:29 compute-0 podman[196025]: time="2026-02-19T19:57:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:57:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:57:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:57:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:57:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2198 "" "Go-http-client/1.1"
Feb 19 19:57:30 compute-0 nova_compute[186662]: 2026-02-19 19:57:30.872 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:31 compute-0 podman[219752]: 2026-02-19 19:57:31.261534422 +0000 UTC m=+0.043196619 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 19 19:57:31 compute-0 openstack_network_exporter[198916]: ERROR   19:57:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:57:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:57:31 compute-0 openstack_network_exporter[198916]: ERROR   19:57:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:57:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:57:31 compute-0 nova_compute[186662]: 2026-02-19 19:57:31.953 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:57:32.171 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:57:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:57:32.171 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:57:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:57:32.172 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:57:33 compute-0 nova_compute[186662]: 2026-02-19 19:57:33.082 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:57:35 compute-0 nova_compute[186662]: 2026-02-19 19:57:35.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:57:35 compute-0 nova_compute[186662]: 2026-02-19 19:57:35.873 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:36 compute-0 nova_compute[186662]: 2026-02-19 19:57:36.956 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:40 compute-0 nova_compute[186662]: 2026-02-19 19:57:40.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:57:40 compute-0 nova_compute[186662]: 2026-02-19 19:57:40.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:57:40 compute-0 nova_compute[186662]: 2026-02-19 19:57:40.875 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:41 compute-0 nova_compute[186662]: 2026-02-19 19:57:41.957 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:42 compute-0 nova_compute[186662]: 2026-02-19 19:57:42.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:57:42 compute-0 nova_compute[186662]: 2026-02-19 19:57:42.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:57:43 compute-0 nova_compute[186662]: 2026-02-19 19:57:43.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:57:44 compute-0 nova_compute[186662]: 2026-02-19 19:57:44.080 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:57:44 compute-0 nova_compute[186662]: 2026-02-19 19:57:44.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:57:45 compute-0 nova_compute[186662]: 2026-02-19 19:57:45.094 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:57:45 compute-0 nova_compute[186662]: 2026-02-19 19:57:45.094 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:57:45 compute-0 nova_compute[186662]: 2026-02-19 19:57:45.094 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:57:45 compute-0 nova_compute[186662]: 2026-02-19 19:57:45.094 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:57:45 compute-0 nova_compute[186662]: 2026-02-19 19:57:45.207 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:57:45 compute-0 nova_compute[186662]: 2026-02-19 19:57:45.208 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:57:45 compute-0 nova_compute[186662]: 2026-02-19 19:57:45.219 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.010s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:57:45 compute-0 nova_compute[186662]: 2026-02-19 19:57:45.219 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5811MB free_disk=72.96762084960938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:57:45 compute-0 nova_compute[186662]: 2026-02-19 19:57:45.220 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:57:45 compute-0 nova_compute[186662]: 2026-02-19 19:57:45.220 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:57:45 compute-0 nova_compute[186662]: 2026-02-19 19:57:45.877 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:46 compute-0 nova_compute[186662]: 2026-02-19 19:57:46.345 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:57:46 compute-0 nova_compute[186662]: 2026-02-19 19:57:46.346 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:57:45 up  1:28,  0 user,  load average: 0.05, 0.11, 0.16\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:57:46 compute-0 nova_compute[186662]: 2026-02-19 19:57:46.365 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:57:46 compute-0 nova_compute[186662]: 2026-02-19 19:57:46.872 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:57:46 compute-0 nova_compute[186662]: 2026-02-19 19:57:46.960 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:47 compute-0 nova_compute[186662]: 2026-02-19 19:57:47.380 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:57:47 compute-0 nova_compute[186662]: 2026-02-19 19:57:47.381 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.161s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:57:48 compute-0 nova_compute[186662]: 2026-02-19 19:57:48.381 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:57:50 compute-0 nova_compute[186662]: 2026-02-19 19:57:50.878 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:51 compute-0 nova_compute[186662]: 2026-02-19 19:57:51.961 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:52 compute-0 podman[219781]: 2026-02-19 19:57:52.262698885 +0000 UTC m=+0.035520972 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Feb 19 19:57:55 compute-0 nova_compute[186662]: 2026-02-19 19:57:55.880 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:56 compute-0 podman[219802]: 2026-02-19 19:57:56.28174709 +0000 UTC m=+0.059051844 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1770267347, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 19 19:57:56 compute-0 nova_compute[186662]: 2026-02-19 19:57:56.963 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:57:58 compute-0 sshd-session[219823]: Received disconnect from 45.148.10.147 port 40582:11:  [preauth]
Feb 19 19:57:58 compute-0 sshd-session[219823]: Disconnected from authenticating user root 45.148.10.147 port 40582 [preauth]
Feb 19 19:57:59 compute-0 podman[219825]: 2026-02-19 19:57:59.282193768 +0000 UTC m=+0.060264373 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS)
Feb 19 19:57:59 compute-0 podman[196025]: time="2026-02-19T19:57:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:57:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:57:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:57:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:57:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2198 "" "Go-http-client/1.1"
Feb 19 19:58:00 compute-0 nova_compute[186662]: 2026-02-19 19:58:00.882 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:01 compute-0 openstack_network_exporter[198916]: ERROR   19:58:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:58:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:58:01 compute-0 openstack_network_exporter[198916]: ERROR   19:58:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:58:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:58:01 compute-0 anacron[49500]: Job `cron.weekly' started
Feb 19 19:58:01 compute-0 anacron[49500]: Job `cron.weekly' terminated
Feb 19 19:58:01 compute-0 nova_compute[186662]: 2026-02-19 19:58:01.964 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:02 compute-0 podman[219853]: 2026-02-19 19:58:02.260390844 +0000 UTC m=+0.038398183 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 19 19:58:05 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 19 19:58:05 compute-0 nova_compute[186662]: 2026-02-19 19:58:05.884 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:06 compute-0 nova_compute[186662]: 2026-02-19 19:58:06.965 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:10 compute-0 nova_compute[186662]: 2026-02-19 19:58:10.886 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:11 compute-0 nova_compute[186662]: 2026-02-19 19:58:11.967 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:15 compute-0 nova_compute[186662]: 2026-02-19 19:58:15.888 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:16 compute-0 sshd-session[219879]: Invalid user ftp-test from 103.67.78.251 port 40174
Feb 19 19:58:16 compute-0 sshd-session[219879]: Received disconnect from 103.67.78.251 port 40174:11: Bye Bye [preauth]
Feb 19 19:58:16 compute-0 sshd-session[219879]: Disconnected from invalid user ftp-test 103.67.78.251 port 40174 [preauth]
Feb 19 19:58:16 compute-0 nova_compute[186662]: 2026-02-19 19:58:16.973 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:20 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:58:20.868 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 19:58:20 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:58:20.869 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 19:58:20 compute-0 nova_compute[186662]: 2026-02-19 19:58:20.870 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:20 compute-0 nova_compute[186662]: 2026-02-19 19:58:20.889 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:21 compute-0 nova_compute[186662]: 2026-02-19 19:58:21.974 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:23 compute-0 podman[219885]: 2026-02-19 19:58:23.299497155 +0000 UTC m=+0.075608227 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible)
Feb 19 19:58:25 compute-0 nova_compute[186662]: 2026-02-19 19:58:25.933 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:27 compute-0 nova_compute[186662]: 2026-02-19 19:58:27.001 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:27 compute-0 podman[219905]: 2026-02-19 19:58:27.302350352 +0000 UTC m=+0.080917475 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, release=1770267347, vcs-type=git, vendor=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, distribution-scope=public)
Feb 19 19:58:28 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:58:28.871 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 19:58:29 compute-0 podman[196025]: time="2026-02-19T19:58:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:58:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:58:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:58:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:58:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2196 "" "Go-http-client/1.1"
Feb 19 19:58:30 compute-0 podman[219927]: 2026-02-19 19:58:30.303327351 +0000 UTC m=+0.072017488 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 19 19:58:30 compute-0 nova_compute[186662]: 2026-02-19 19:58:30.935 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:31 compute-0 openstack_network_exporter[198916]: ERROR   19:58:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:58:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:58:31 compute-0 openstack_network_exporter[198916]: ERROR   19:58:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:58:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:58:32 compute-0 nova_compute[186662]: 2026-02-19 19:58:32.004 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:58:32.172 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:58:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:58:32.173 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:58:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:58:32.173 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:58:33 compute-0 podman[219955]: 2026-02-19 19:58:33.268707978 +0000 UTC m=+0.047961576 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 19:58:33 compute-0 nova_compute[186662]: 2026-02-19 19:58:33.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:58:35 compute-0 nova_compute[186662]: 2026-02-19 19:58:35.936 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:36 compute-0 nova_compute[186662]: 2026-02-19 19:58:36.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:58:37 compute-0 nova_compute[186662]: 2026-02-19 19:58:37.006 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:38 compute-0 sshd-session[219982]: Received disconnect from 106.51.64.128 port 24108:11: Bye Bye [preauth]
Feb 19 19:58:38 compute-0 sshd-session[219982]: Disconnected from authenticating user root 106.51.64.128 port 24108 [preauth]
Feb 19 19:58:40 compute-0 nova_compute[186662]: 2026-02-19 19:58:40.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:58:40 compute-0 nova_compute[186662]: 2026-02-19 19:58:40.938 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:42 compute-0 nova_compute[186662]: 2026-02-19 19:58:42.028 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:42 compute-0 nova_compute[186662]: 2026-02-19 19:58:42.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:58:43 compute-0 nova_compute[186662]: 2026-02-19 19:58:43.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:58:43 compute-0 nova_compute[186662]: 2026-02-19 19:58:43.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:58:43 compute-0 nova_compute[186662]: 2026-02-19 19:58:43.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:58:44 compute-0 nova_compute[186662]: 2026-02-19 19:58:44.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:58:45 compute-0 nova_compute[186662]: 2026-02-19 19:58:45.092 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:58:45 compute-0 nova_compute[186662]: 2026-02-19 19:58:45.092 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:58:45 compute-0 nova_compute[186662]: 2026-02-19 19:58:45.092 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:58:45 compute-0 nova_compute[186662]: 2026-02-19 19:58:45.093 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:58:45 compute-0 nova_compute[186662]: 2026-02-19 19:58:45.232 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:58:45 compute-0 nova_compute[186662]: 2026-02-19 19:58:45.233 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:58:45 compute-0 nova_compute[186662]: 2026-02-19 19:58:45.244 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.011s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:58:45 compute-0 nova_compute[186662]: 2026-02-19 19:58:45.245 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5842MB free_disk=72.96761703491211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:58:45 compute-0 nova_compute[186662]: 2026-02-19 19:58:45.245 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:58:45 compute-0 nova_compute[186662]: 2026-02-19 19:58:45.245 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:58:45 compute-0 nova_compute[186662]: 2026-02-19 19:58:45.940 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:46 compute-0 nova_compute[186662]: 2026-02-19 19:58:46.442 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:58:46 compute-0 nova_compute[186662]: 2026-02-19 19:58:46.442 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:58:45 up  1:29,  0 user,  load average: 0.02, 0.08, 0.15\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:58:46 compute-0 nova_compute[186662]: 2026-02-19 19:58:46.472 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:58:46 compute-0 nova_compute[186662]: 2026-02-19 19:58:46.979 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:58:47 compute-0 nova_compute[186662]: 2026-02-19 19:58:47.030 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:47 compute-0 nova_compute[186662]: 2026-02-19 19:58:47.488 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:58:47 compute-0 nova_compute[186662]: 2026-02-19 19:58:47.489 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.243s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:58:48 compute-0 nova_compute[186662]: 2026-02-19 19:58:48.489 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:58:50 compute-0 nova_compute[186662]: 2026-02-19 19:58:50.942 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:52 compute-0 nova_compute[186662]: 2026-02-19 19:58:52.031 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:52 compute-0 sshd-session[219985]: Connection closed by 103.213.244.180 port 51112 [preauth]
Feb 19 19:58:54 compute-0 podman[219987]: 2026-02-19 19:58:54.269188766 +0000 UTC m=+0.048023287 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 19 19:58:55 compute-0 nova_compute[186662]: 2026-02-19 19:58:55.995 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:57 compute-0 nova_compute[186662]: 2026-02-19 19:58:57.089 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:58:58 compute-0 podman[220007]: 2026-02-19 19:58:58.258733499 +0000 UTC m=+0.039120001 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1770267347)
Feb 19 19:58:59 compute-0 podman[196025]: time="2026-02-19T19:58:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:58:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:58:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:58:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:58:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2199 "" "Go-http-client/1.1"
Feb 19 19:59:00 compute-0 nova_compute[186662]: 2026-02-19 19:59:00.998 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:01 compute-0 sshd-session[220030]: error: kex_exchange_identification: read: Connection reset by peer
Feb 19 19:59:01 compute-0 sshd-session[220030]: Connection reset by 176.120.22.52 port 41509
Feb 19 19:59:01 compute-0 podman[220031]: 2026-02-19 19:59:01.29932734 +0000 UTC m=+0.079479271 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0)
Feb 19 19:59:01 compute-0 openstack_network_exporter[198916]: ERROR   19:59:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:59:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:59:01 compute-0 openstack_network_exporter[198916]: ERROR   19:59:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:59:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:59:02 compute-0 nova_compute[186662]: 2026-02-19 19:59:02.089 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:04 compute-0 podman[220059]: 2026-02-19 19:59:04.289946627 +0000 UTC m=+0.055023156 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 19:59:05 compute-0 nova_compute[186662]: 2026-02-19 19:59:05.998 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:07 compute-0 nova_compute[186662]: 2026-02-19 19:59:07.091 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:11 compute-0 nova_compute[186662]: 2026-02-19 19:59:11.000 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:12 compute-0 nova_compute[186662]: 2026-02-19 19:59:12.094 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:16 compute-0 nova_compute[186662]: 2026-02-19 19:59:16.051 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:17 compute-0 nova_compute[186662]: 2026-02-19 19:59:17.125 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:21 compute-0 nova_compute[186662]: 2026-02-19 19:59:21.053 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:22 compute-0 nova_compute[186662]: 2026-02-19 19:59:22.126 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:25 compute-0 podman[220083]: 2026-02-19 19:59:25.264442176 +0000 UTC m=+0.042398150 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 19:59:26 compute-0 sshd-session[220103]: Invalid user n8n from 45.169.200.254 port 34676
Feb 19 19:59:26 compute-0 nova_compute[186662]: 2026-02-19 19:59:26.107 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:26 compute-0 sshd-session[220103]: Received disconnect from 45.169.200.254 port 34676:11: Bye Bye [preauth]
Feb 19 19:59:26 compute-0 sshd-session[220103]: Disconnected from invalid user n8n 45.169.200.254 port 34676 [preauth]
Feb 19 19:59:27 compute-0 nova_compute[186662]: 2026-02-19 19:59:27.126 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:29 compute-0 podman[220105]: 2026-02-19 19:59:29.287427942 +0000 UTC m=+0.069104639 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1770267347, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Feb 19 19:59:29 compute-0 podman[196025]: time="2026-02-19T19:59:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:59:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:59:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:59:29 compute-0 podman[196025]: @ - - [19/Feb/2026:19:59:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2197 "" "Go-http-client/1.1"
Feb 19 19:59:31 compute-0 nova_compute[186662]: 2026-02-19 19:59:31.109 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:31 compute-0 openstack_network_exporter[198916]: ERROR   19:59:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 19:59:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:59:31 compute-0 openstack_network_exporter[198916]: ERROR   19:59:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 19:59:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 19:59:32 compute-0 nova_compute[186662]: 2026-02-19 19:59:32.128 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:59:32.173 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:59:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:59:32.174 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:59:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 19:59:32.174 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:59:32 compute-0 podman[220128]: 2026-02-19 19:59:32.317361144 +0000 UTC m=+0.097220071 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 19 19:59:32 compute-0 sshd-session[220155]: Accepted publickey for zuul from 192.168.122.10 port 35076 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 19:59:32 compute-0 systemd-logind[822]: New session 40 of user zuul.
Feb 19 19:59:32 compute-0 systemd[1]: Started Session 40 of User zuul.
Feb 19 19:59:32 compute-0 sshd-session[220155]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 19:59:32 compute-0 sudo[220159]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Feb 19 19:59:32 compute-0 sudo[220159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 19:59:34 compute-0 podman[220296]: 2026-02-19 19:59:34.705417396 +0000 UTC m=+0.048268412 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 19 19:59:35 compute-0 nova_compute[186662]: 2026-02-19 19:59:35.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:59:36 compute-0 nova_compute[186662]: 2026-02-19 19:59:36.111 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:36 compute-0 ovs-vsctl[220352]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Feb 19 19:59:37 compute-0 nova_compute[186662]: 2026-02-19 19:59:37.128 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:37 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 220183 (sos)
Feb 19 19:59:37 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Feb 19 19:59:37 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Feb 19 19:59:37 compute-0 virtqemud[186157]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Feb 19 19:59:37 compute-0 virtqemud[186157]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Feb 19 19:59:37 compute-0 nova_compute[186662]: 2026-02-19 19:59:37.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:59:37 compute-0 virtqemud[186157]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 19 19:59:38 compute-0 crontab[220763]: (root) LIST (root)
Feb 19 19:59:39 compute-0 kernel: /proc/cgroups lists only v1 controllers, use cgroup.controllers of root cgroup for v2 info
Feb 19 19:59:40 compute-0 systemd[1]: Starting Hostname Service...
Feb 19 19:59:40 compute-0 systemd[1]: Started Hostname Service.
Feb 19 19:59:41 compute-0 nova_compute[186662]: 2026-02-19 19:59:41.167 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:42 compute-0 nova_compute[186662]: 2026-02-19 19:59:42.129 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:42 compute-0 nova_compute[186662]: 2026-02-19 19:59:42.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:59:42 compute-0 nova_compute[186662]: 2026-02-19 19:59:42.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:59:44 compute-0 nova_compute[186662]: 2026-02-19 19:59:44.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:59:45 compute-0 nova_compute[186662]: 2026-02-19 19:59:45.085 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:59:45 compute-0 nova_compute[186662]: 2026-02-19 19:59:45.086 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:59:45 compute-0 nova_compute[186662]: 2026-02-19 19:59:45.086 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:59:45 compute-0 nova_compute[186662]: 2026-02-19 19:59:45.086 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 19:59:45 compute-0 nova_compute[186662]: 2026-02-19 19:59:45.223 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 19:59:45 compute-0 nova_compute[186662]: 2026-02-19 19:59:45.224 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 19:59:45 compute-0 nova_compute[186662]: 2026-02-19 19:59:45.236 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 19:59:45 compute-0 nova_compute[186662]: 2026-02-19 19:59:45.237 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5425MB free_disk=72.71479034423828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 19:59:45 compute-0 nova_compute[186662]: 2026-02-19 19:59:45.237 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 19:59:45 compute-0 nova_compute[186662]: 2026-02-19 19:59:45.237 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 19:59:46 compute-0 nova_compute[186662]: 2026-02-19 19:59:46.169 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:46 compute-0 nova_compute[186662]: 2026-02-19 19:59:46.279 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 19:59:46 compute-0 nova_compute[186662]: 2026-02-19 19:59:46.279 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:59:45 up  1:30,  0 user,  load average: 0.32, 0.13, 0.16\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 19:59:46 compute-0 nova_compute[186662]: 2026-02-19 19:59:46.306 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing inventories for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Feb 19 19:59:46 compute-0 nova_compute[186662]: 2026-02-19 19:59:46.320 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating ProviderTree inventory for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Feb 19 19:59:46 compute-0 nova_compute[186662]: 2026-02-19 19:59:46.320 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating inventory in ProviderTree for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Feb 19 19:59:46 compute-0 nova_compute[186662]: 2026-02-19 19:59:46.335 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing aggregate associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Feb 19 19:59:46 compute-0 nova_compute[186662]: 2026-02-19 19:59:46.358 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing trait associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_SSE2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,HW_ARCH_X86_64,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_USB _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Feb 19 19:59:46 compute-0 nova_compute[186662]: 2026-02-19 19:59:46.386 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 19:59:46 compute-0 nova_compute[186662]: 2026-02-19 19:59:46.892 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 19:59:47 compute-0 nova_compute[186662]: 2026-02-19 19:59:47.167 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:47 compute-0 nova_compute[186662]: 2026-02-19 19:59:47.404 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 19:59:47 compute-0 nova_compute[186662]: 2026-02-19 19:59:47.404 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.167s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 19:59:47 compute-0 ovs-appctl[221952]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Feb 19 19:59:47 compute-0 ovs-appctl[221955]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Feb 19 19:59:48 compute-0 ovs-appctl[221960]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Feb 19 19:59:48 compute-0 nova_compute[186662]: 2026-02-19 19:59:48.400 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:59:48 compute-0 nova_compute[186662]: 2026-02-19 19:59:48.914 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:59:48 compute-0 nova_compute[186662]: 2026-02-19 19:59:48.915 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:59:48 compute-0 nova_compute[186662]: 2026-02-19 19:59:48.915 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 19:59:48 compute-0 nova_compute[186662]: 2026-02-19 19:59:48.916 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 19:59:51 compute-0 nova_compute[186662]: 2026-02-19 19:59:51.171 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:52 compute-0 nova_compute[186662]: 2026-02-19 19:59:52.167 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:53 compute-0 virtqemud[186157]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 19 19:59:54 compute-0 sshd-session[223052]: Received disconnect from 96.78.175.42 port 54324:11: Bye Bye [preauth]
Feb 19 19:59:54 compute-0 sshd-session[223052]: Disconnected from authenticating user root 96.78.175.42 port 54324 [preauth]
Feb 19 19:59:55 compute-0 systemd[1]: Starting Time & Date Service...
Feb 19 19:59:55 compute-0 systemd[1]: Started Time & Date Service.
Feb 19 19:59:55 compute-0 podman[223443]: 2026-02-19 19:59:55.352281102 +0000 UTC m=+0.061119804 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 19 19:59:56 compute-0 nova_compute[186662]: 2026-02-19 19:59:56.174 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:57 compute-0 nova_compute[186662]: 2026-02-19 19:59:57.169 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 19:59:59 compute-0 podman[196025]: time="2026-02-19T19:59:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 19:59:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:59:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 19:59:59 compute-0 podman[196025]: @ - - [19/Feb/2026:19:59:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2199 "" "Go-http-client/1.1"
Feb 19 20:00:00 compute-0 podman[223491]: 2026-02-19 20:00:00.270372893 +0000 UTC m=+0.046055599 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 19 20:00:01 compute-0 nova_compute[186662]: 2026-02-19 20:00:01.176 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:01 compute-0 openstack_network_exporter[198916]: ERROR   20:00:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:00:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:00:01 compute-0 openstack_network_exporter[198916]: ERROR   20:00:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:00:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:00:02 compute-0 nova_compute[186662]: 2026-02-19 20:00:02.172 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:02 compute-0 podman[223514]: 2026-02-19 20:00:02.62230644 +0000 UTC m=+0.068793691 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, 
org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 19 20:00:05 compute-0 podman[223541]: 2026-02-19 20:00:05.270782792 +0000 UTC m=+0.048638941 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 20:00:06 compute-0 nova_compute[186662]: 2026-02-19 20:00:06.179 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:07 compute-0 nova_compute[186662]: 2026-02-19 20:00:07.174 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:11 compute-0 nova_compute[186662]: 2026-02-19 20:00:11.223 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:12 compute-0 nova_compute[186662]: 2026-02-19 20:00:12.178 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:16 compute-0 nova_compute[186662]: 2026-02-19 20:00:16.225 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:16 compute-0 sudo[220159]: pam_unix(sudo:session): session closed for user root
Feb 19 20:00:16 compute-0 sshd-session[220158]: Received disconnect from 192.168.122.10 port 35076:11: disconnected by user
Feb 19 20:00:16 compute-0 sshd-session[220158]: Disconnected from user zuul 192.168.122.10 port 35076
Feb 19 20:00:16 compute-0 sshd-session[220155]: pam_unix(sshd:session): session closed for user zuul
Feb 19 20:00:16 compute-0 systemd-logind[822]: Session 40 logged out. Waiting for processes to exit.
Feb 19 20:00:16 compute-0 systemd[1]: session-40.scope: Deactivated successfully.
Feb 19 20:00:16 compute-0 systemd[1]: session-40.scope: Consumed 1min 12.188s CPU time, 715.1M memory peak, read 313.3M from disk, written 26.1M to disk.
Feb 19 20:00:16 compute-0 systemd-logind[822]: Removed session 40.
Feb 19 20:00:17 compute-0 sshd-session[223566]: Accepted publickey for zuul from 192.168.122.10 port 60930 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 20:00:17 compute-0 systemd-logind[822]: New session 41 of user zuul.
Feb 19 20:00:17 compute-0 nova_compute[186662]: 2026-02-19 20:00:17.234 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:17 compute-0 systemd[1]: Started Session 41 of User zuul.
Feb 19 20:00:17 compute-0 sshd-session[223566]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 20:00:17 compute-0 sudo[223570]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2026-02-19-rgcormg.tar.xz
Feb 19 20:00:17 compute-0 sudo[223570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 20:00:17 compute-0 sudo[223570]: pam_unix(sudo:session): session closed for user root
Feb 19 20:00:17 compute-0 sshd-session[223569]: Received disconnect from 192.168.122.10 port 60930:11: disconnected by user
Feb 19 20:00:17 compute-0 sshd-session[223569]: Disconnected from user zuul 192.168.122.10 port 60930
Feb 19 20:00:17 compute-0 sshd-session[223566]: pam_unix(sshd:session): session closed for user zuul
Feb 19 20:00:17 compute-0 systemd[1]: session-41.scope: Deactivated successfully.
Feb 19 20:00:17 compute-0 systemd-logind[822]: Session 41 logged out. Waiting for processes to exit.
Feb 19 20:00:17 compute-0 systemd-logind[822]: Removed session 41.
Feb 19 20:00:17 compute-0 sshd-session[223595]: Accepted publickey for zuul from 192.168.122.10 port 60944 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 20:00:17 compute-0 systemd-logind[822]: New session 42 of user zuul.
Feb 19 20:00:17 compute-0 systemd[1]: Started Session 42 of User zuul.
Feb 19 20:00:17 compute-0 sshd-session[223595]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 20:00:17 compute-0 sudo[223599]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Feb 19 20:00:17 compute-0 sudo[223599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 20:00:17 compute-0 sudo[223599]: pam_unix(sudo:session): session closed for user root
Feb 19 20:00:17 compute-0 sshd-session[223598]: Received disconnect from 192.168.122.10 port 60944:11: disconnected by user
Feb 19 20:00:17 compute-0 sshd-session[223598]: Disconnected from user zuul 192.168.122.10 port 60944
Feb 19 20:00:17 compute-0 sshd-session[223595]: pam_unix(sshd:session): session closed for user zuul
Feb 19 20:00:17 compute-0 systemd[1]: session-42.scope: Deactivated successfully.
Feb 19 20:00:17 compute-0 systemd-logind[822]: Session 42 logged out. Waiting for processes to exit.
Feb 19 20:00:17 compute-0 systemd-logind[822]: Removed session 42.
Feb 19 20:00:21 compute-0 nova_compute[186662]: 2026-02-19 20:00:21.227 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:22 compute-0 nova_compute[186662]: 2026-02-19 20:00:22.235 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:25 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 19 20:00:25 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 19 20:00:26 compute-0 nova_compute[186662]: 2026-02-19 20:00:26.229 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:26 compute-0 podman[223628]: 2026-02-19 20:00:26.293515432 +0000 UTC m=+0.072508741 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 19 20:00:27 compute-0 nova_compute[186662]: 2026-02-19 20:00:27.236 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:29 compute-0 podman[196025]: time="2026-02-19T20:00:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:00:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:00:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:00:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:00:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2198 "" "Go-http-client/1.1"
Feb 19 20:00:31 compute-0 nova_compute[186662]: 2026-02-19 20:00:31.231 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:31 compute-0 podman[223649]: 2026-02-19 20:00:31.269389356 +0000 UTC m=+0.050809874 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, version=9.7, architecture=x86_64, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., release=1770267347, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 19 20:00:31 compute-0 openstack_network_exporter[198916]: ERROR   20:00:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:00:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:00:31 compute-0 openstack_network_exporter[198916]: ERROR   20:00:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:00:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:00:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:00:32.174 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:00:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:00:32.176 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:00:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:00:32.176 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:00:32 compute-0 nova_compute[186662]: 2026-02-19 20:00:32.237 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:33 compute-0 podman[223671]: 2026-02-19 20:00:33.343740043 +0000 UTC m=+0.118728352 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, 
io.buildah.version=1.43.0, org.label-schema.vendor=CentOS)
Feb 19 20:00:36 compute-0 nova_compute[186662]: 2026-02-19 20:00:36.232 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:36 compute-0 podman[223697]: 2026-02-19 20:00:36.26219266 +0000 UTC m=+0.039357736 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 20:00:36 compute-0 nova_compute[186662]: 2026-02-19 20:00:36.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:00:37 compute-0 nova_compute[186662]: 2026-02-19 20:00:37.247 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:38 compute-0 nova_compute[186662]: 2026-02-19 20:00:38.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:00:39 compute-0 nova_compute[186662]: 2026-02-19 20:00:39.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:00:41 compute-0 nova_compute[186662]: 2026-02-19 20:00:41.234 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:42 compute-0 nova_compute[186662]: 2026-02-19 20:00:42.249 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:43 compute-0 nova_compute[186662]: 2026-02-19 20:00:43.083 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:00:43 compute-0 nova_compute[186662]: 2026-02-19 20:00:43.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:00:44 compute-0 nova_compute[186662]: 2026-02-19 20:00:44.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:00:45 compute-0 nova_compute[186662]: 2026-02-19 20:00:45.094 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:00:45 compute-0 nova_compute[186662]: 2026-02-19 20:00:45.094 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:00:45 compute-0 nova_compute[186662]: 2026-02-19 20:00:45.095 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:00:45 compute-0 nova_compute[186662]: 2026-02-19 20:00:45.095 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 20:00:45 compute-0 nova_compute[186662]: 2026-02-19 20:00:45.221 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 20:00:45 compute-0 nova_compute[186662]: 2026-02-19 20:00:45.221 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 20:00:45 compute-0 nova_compute[186662]: 2026-02-19 20:00:45.237 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 20:00:45 compute-0 nova_compute[186662]: 2026-02-19 20:00:45.238 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5665MB free_disk=72.96735000610352GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 20:00:45 compute-0 nova_compute[186662]: 2026-02-19 20:00:45.238 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:00:45 compute-0 nova_compute[186662]: 2026-02-19 20:00:45.238 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:00:46 compute-0 nova_compute[186662]: 2026-02-19 20:00:46.236 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:46 compute-0 nova_compute[186662]: 2026-02-19 20:00:46.288 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 20:00:46 compute-0 nova_compute[186662]: 2026-02-19 20:00:46.288 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:00:45 up  1:31,  0 user,  load average: 0.49, 0.24, 0.20\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 20:00:46 compute-0 nova_compute[186662]: 2026-02-19 20:00:46.306 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 20:00:46 compute-0 nova_compute[186662]: 2026-02-19 20:00:46.812 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 20:00:47 compute-0 nova_compute[186662]: 2026-02-19 20:00:47.250 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:47 compute-0 nova_compute[186662]: 2026-02-19 20:00:47.320 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 20:00:47 compute-0 nova_compute[186662]: 2026-02-19 20:00:47.321 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.083s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:00:48 compute-0 nova_compute[186662]: 2026-02-19 20:00:48.321 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:00:48 compute-0 nova_compute[186662]: 2026-02-19 20:00:48.321 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:00:48 compute-0 nova_compute[186662]: 2026-02-19 20:00:48.321 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:00:48 compute-0 nova_compute[186662]: 2026-02-19 20:00:48.321 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 20:00:51 compute-0 nova_compute[186662]: 2026-02-19 20:00:51.240 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:52 compute-0 nova_compute[186662]: 2026-02-19 20:00:52.252 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:56 compute-0 nova_compute[186662]: 2026-02-19 20:00:56.242 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:57 compute-0 nova_compute[186662]: 2026-02-19 20:00:57.254 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:00:57 compute-0 podman[223723]: 2026-02-19 20:00:57.265718656 +0000 UTC m=+0.046237443 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 19 20:00:59 compute-0 podman[196025]: time="2026-02-19T20:00:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:00:59 compute-0 podman[196025]: @ - - [19/Feb/2026:20:00:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:00:59 compute-0 podman[196025]: @ - - [19/Feb/2026:20:00:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2194 "" "Go-http-client/1.1"
Feb 19 20:01:00 compute-0 nova_compute[186662]: 2026-02-19 20:01:00.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:01:00 compute-0 nova_compute[186662]: 2026-02-19 20:01:00.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Feb 19 20:01:01 compute-0 nova_compute[186662]: 2026-02-19 20:01:01.084 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Feb 19 20:01:01 compute-0 nova_compute[186662]: 2026-02-19 20:01:01.244 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:01 compute-0 openstack_network_exporter[198916]: ERROR   20:01:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:01:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:01:01 compute-0 openstack_network_exporter[198916]: ERROR   20:01:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:01:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:01:01 compute-0 CROND[223744]: (root) CMD (run-parts /etc/cron.hourly)
Feb 19 20:01:01 compute-0 run-parts[223747]: (/etc/cron.hourly) starting 0anacron
Feb 19 20:01:01 compute-0 run-parts[223753]: (/etc/cron.hourly) finished 0anacron
Feb 19 20:01:01 compute-0 CROND[223743]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 19 20:01:02 compute-0 nova_compute[186662]: 2026-02-19 20:01:02.255 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:02 compute-0 podman[223754]: 2026-02-19 20:01:02.281455027 +0000 UTC m=+0.055604130 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, version=9.7, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Feb 19 20:01:02 compute-0 nova_compute[186662]: 2026-02-19 20:01:02.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:01:02 compute-0 nova_compute[186662]: 2026-02-19 20:01:02.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Feb 19 20:01:04 compute-0 podman[223776]: 2026-02-19 20:01:04.294399715 +0000 UTC m=+0.073081065 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 19 20:01:06 compute-0 nova_compute[186662]: 2026-02-19 20:01:06.245 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:07 compute-0 nova_compute[186662]: 2026-02-19 20:01:07.258 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:07 compute-0 podman[223803]: 2026-02-19 20:01:07.265480749 +0000 UTC m=+0.046020958 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 20:01:11 compute-0 nova_compute[186662]: 2026-02-19 20:01:11.248 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:12 compute-0 nova_compute[186662]: 2026-02-19 20:01:12.261 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:16 compute-0 nova_compute[186662]: 2026-02-19 20:01:16.262 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:17 compute-0 nova_compute[186662]: 2026-02-19 20:01:17.262 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:21 compute-0 nova_compute[186662]: 2026-02-19 20:01:21.264 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:22 compute-0 nova_compute[186662]: 2026-02-19 20:01:22.265 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:26 compute-0 nova_compute[186662]: 2026-02-19 20:01:26.266 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:27 compute-0 nova_compute[186662]: 2026-02-19 20:01:27.266 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:28 compute-0 podman[223827]: 2026-02-19 20:01:28.265357983 +0000 UTC m=+0.042762738 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Feb 19 20:01:29 compute-0 sshd-session[223848]: Invalid user n8n from 106.51.64.128 port 40792
Feb 19 20:01:29 compute-0 podman[196025]: time="2026-02-19T20:01:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:01:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:01:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:01:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:01:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2194 "" "Go-http-client/1.1"
Feb 19 20:01:29 compute-0 sshd-session[223848]: Received disconnect from 106.51.64.128 port 40792:11: Bye Bye [preauth]
Feb 19 20:01:29 compute-0 sshd-session[223848]: Disconnected from invalid user n8n 106.51.64.128 port 40792 [preauth]
Feb 19 20:01:31 compute-0 nova_compute[186662]: 2026-02-19 20:01:31.269 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:31 compute-0 openstack_network_exporter[198916]: ERROR   20:01:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:01:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:01:31 compute-0 openstack_network_exporter[198916]: ERROR   20:01:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:01:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:01:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:01:32.177 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:01:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:01:32.177 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:01:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:01:32.177 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:01:32 compute-0 nova_compute[186662]: 2026-02-19 20:01:32.270 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:33 compute-0 podman[223851]: 2026-02-19 20:01:33.275448717 +0000 UTC m=+0.054070054 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, release=1770267347, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Feb 19 20:01:35 compute-0 podman[223874]: 2026-02-19 20:01:35.35579192 +0000 UTC m=+0.134801252 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 19 20:01:36 compute-0 nova_compute[186662]: 2026-02-19 20:01:36.271 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:37 compute-0 nova_compute[186662]: 2026-02-19 20:01:37.104 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:01:37 compute-0 nova_compute[186662]: 2026-02-19 20:01:37.270 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:38 compute-0 podman[223901]: 2026-02-19 20:01:38.259722654 +0000 UTC m=+0.040193476 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 20:01:39 compute-0 nova_compute[186662]: 2026-02-19 20:01:39.577 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:01:41 compute-0 nova_compute[186662]: 2026-02-19 20:01:41.273 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:42 compute-0 nova_compute[186662]: 2026-02-19 20:01:42.291 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:43 compute-0 nova_compute[186662]: 2026-02-19 20:01:43.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:01:44 compute-0 nova_compute[186662]: 2026-02-19 20:01:44.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:01:45 compute-0 nova_compute[186662]: 2026-02-19 20:01:45.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:01:45 compute-0 nova_compute[186662]: 2026-02-19 20:01:45.576 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 20:01:46 compute-0 nova_compute[186662]: 2026-02-19 20:01:46.274 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:46 compute-0 nova_compute[186662]: 2026-02-19 20:01:46.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:01:47 compute-0 nova_compute[186662]: 2026-02-19 20:01:47.090 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:01:47 compute-0 nova_compute[186662]: 2026-02-19 20:01:47.090 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:01:47 compute-0 nova_compute[186662]: 2026-02-19 20:01:47.090 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:01:47 compute-0 nova_compute[186662]: 2026-02-19 20:01:47.091 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 20:01:47 compute-0 nova_compute[186662]: 2026-02-19 20:01:47.208 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 20:01:47 compute-0 nova_compute[186662]: 2026-02-19 20:01:47.209 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 20:01:47 compute-0 nova_compute[186662]: 2026-02-19 20:01:47.219 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.011s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 20:01:47 compute-0 nova_compute[186662]: 2026-02-19 20:01:47.220 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5755MB free_disk=72.96735000610352GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 20:01:47 compute-0 nova_compute[186662]: 2026-02-19 20:01:47.220 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:01:47 compute-0 nova_compute[186662]: 2026-02-19 20:01:47.220 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:01:47 compute-0 nova_compute[186662]: 2026-02-19 20:01:47.293 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:48 compute-0 nova_compute[186662]: 2026-02-19 20:01:48.266 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 20:01:48 compute-0 nova_compute[186662]: 2026-02-19 20:01:48.267 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:01:47 up  1:32,  0 user,  load average: 0.16, 0.19, 0.18\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 20:01:48 compute-0 nova_compute[186662]: 2026-02-19 20:01:48.501 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 20:01:49 compute-0 nova_compute[186662]: 2026-02-19 20:01:49.007 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 20:01:49 compute-0 nova_compute[186662]: 2026-02-19 20:01:49.515 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 20:01:49 compute-0 nova_compute[186662]: 2026-02-19 20:01:49.515 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.295s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:01:50 compute-0 nova_compute[186662]: 2026-02-19 20:01:50.515 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:01:50 compute-0 nova_compute[186662]: 2026-02-19 20:01:50.515 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:01:50 compute-0 nova_compute[186662]: 2026-02-19 20:01:50.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:01:51 compute-0 nova_compute[186662]: 2026-02-19 20:01:51.277 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:52 compute-0 nova_compute[186662]: 2026-02-19 20:01:52.296 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:56 compute-0 nova_compute[186662]: 2026-02-19 20:01:56.281 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:57 compute-0 nova_compute[186662]: 2026-02-19 20:01:57.299 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:01:59 compute-0 podman[223926]: 2026-02-19 20:01:59.263464112 +0000 UTC m=+0.045009684 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 20:01:59 compute-0 podman[196025]: time="2026-02-19T20:01:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:01:59 compute-0 podman[196025]: @ - - [19/Feb/2026:20:01:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:01:59 compute-0 podman[196025]: @ - - [19/Feb/2026:20:01:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2198 "" "Go-http-client/1.1"
Feb 19 20:02:01 compute-0 nova_compute[186662]: 2026-02-19 20:02:01.283 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:01 compute-0 openstack_network_exporter[198916]: ERROR   20:02:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:02:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:02:01 compute-0 openstack_network_exporter[198916]: ERROR   20:02:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:02:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:02:02 compute-0 nova_compute[186662]: 2026-02-19 20:02:02.300 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:04 compute-0 podman[223945]: 2026-02-19 20:02:04.268639607 +0000 UTC m=+0.050285282 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, architecture=x86_64, name=ubi9/ubi-minimal, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7)
Feb 19 20:02:06 compute-0 sshd-session[223967]: Invalid user claude from 103.67.78.251 port 48242
Feb 19 20:02:06 compute-0 podman[223969]: 2026-02-19 20:02:06.152530953 +0000 UTC m=+0.057542668 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 19 20:02:06 compute-0 nova_compute[186662]: 2026-02-19 20:02:06.284 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:06 compute-0 sshd-session[223967]: Received disconnect from 103.67.78.251 port 48242:11: Bye Bye [preauth]
Feb 19 20:02:06 compute-0 sshd-session[223967]: Disconnected from invalid user claude 103.67.78.251 port 48242 [preauth]
Feb 19 20:02:07 compute-0 nova_compute[186662]: 2026-02-19 20:02:07.301 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:09 compute-0 podman[223996]: 2026-02-19 20:02:09.289876443 +0000 UTC m=+0.056655147 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 19 20:02:11 compute-0 nova_compute[186662]: 2026-02-19 20:02:11.286 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:12 compute-0 nova_compute[186662]: 2026-02-19 20:02:12.303 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:16 compute-0 nova_compute[186662]: 2026-02-19 20:02:16.289 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:17 compute-0 nova_compute[186662]: 2026-02-19 20:02:17.307 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:21 compute-0 nova_compute[186662]: 2026-02-19 20:02:21.292 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:22 compute-0 nova_compute[186662]: 2026-02-19 20:02:22.309 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:26 compute-0 nova_compute[186662]: 2026-02-19 20:02:26.295 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:27 compute-0 nova_compute[186662]: 2026-02-19 20:02:27.310 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:29 compute-0 podman[196025]: time="2026-02-19T20:02:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:02:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:02:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:02:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:02:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2197 "" "Go-http-client/1.1"
Feb 19 20:02:30 compute-0 podman[224020]: 2026-02-19 20:02:30.259259986 +0000 UTC m=+0.037768878 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 19 20:02:31 compute-0 nova_compute[186662]: 2026-02-19 20:02:31.297 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:31 compute-0 openstack_network_exporter[198916]: ERROR   20:02:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:02:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:02:31 compute-0 openstack_network_exporter[198916]: ERROR   20:02:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:02:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:02:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:02:32.178 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:02:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:02:32.179 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:02:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:02:32.179 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:02:32 compute-0 nova_compute[186662]: 2026-02-19 20:02:32.313 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:35 compute-0 podman[224040]: 2026-02-19 20:02:35.28190292 +0000 UTC m=+0.061188958 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, maintainer=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, config_id=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z)
Feb 19 20:02:36 compute-0 podman[224061]: 2026-02-19 20:02:36.280341044 +0000 UTC m=+0.062372198 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 19 20:02:36 compute-0 nova_compute[186662]: 2026-02-19 20:02:36.299 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:37 compute-0 nova_compute[186662]: 2026-02-19 20:02:37.354 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:37 compute-0 nova_compute[186662]: 2026-02-19 20:02:37.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:02:40 compute-0 podman[224088]: 2026-02-19 20:02:40.273691429 +0000 UTC m=+0.048432269 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 20:02:40 compute-0 nova_compute[186662]: 2026-02-19 20:02:40.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:02:41 compute-0 nova_compute[186662]: 2026-02-19 20:02:41.304 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:42 compute-0 nova_compute[186662]: 2026-02-19 20:02:42.357 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:43 compute-0 nova_compute[186662]: 2026-02-19 20:02:43.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:02:46 compute-0 nova_compute[186662]: 2026-02-19 20:02:46.307 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:46 compute-0 nova_compute[186662]: 2026-02-19 20:02:46.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:02:46 compute-0 nova_compute[186662]: 2026-02-19 20:02:46.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:02:46 compute-0 nova_compute[186662]: 2026-02-19 20:02:46.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 20:02:47 compute-0 nova_compute[186662]: 2026-02-19 20:02:47.359 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:47 compute-0 nova_compute[186662]: 2026-02-19 20:02:47.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:02:47 compute-0 nova_compute[186662]: 2026-02-19 20:02:47.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:02:48 compute-0 nova_compute[186662]: 2026-02-19 20:02:48.089 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:02:48 compute-0 nova_compute[186662]: 2026-02-19 20:02:48.089 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:02:48 compute-0 nova_compute[186662]: 2026-02-19 20:02:48.090 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:02:48 compute-0 nova_compute[186662]: 2026-02-19 20:02:48.090 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 20:02:48 compute-0 nova_compute[186662]: 2026-02-19 20:02:48.190 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 20:02:48 compute-0 nova_compute[186662]: 2026-02-19 20:02:48.191 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 20:02:48 compute-0 nova_compute[186662]: 2026-02-19 20:02:48.203 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.013s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 20:02:48 compute-0 nova_compute[186662]: 2026-02-19 20:02:48.203 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5796MB free_disk=72.96735000610352GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 20:02:48 compute-0 nova_compute[186662]: 2026-02-19 20:02:48.204 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:02:48 compute-0 nova_compute[186662]: 2026-02-19 20:02:48.204 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:02:49 compute-0 nova_compute[186662]: 2026-02-19 20:02:49.280 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 20:02:49 compute-0 nova_compute[186662]: 2026-02-19 20:02:49.281 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:02:48 up  1:33,  0 user,  load average: 0.06, 0.15, 0.17\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 20:02:49 compute-0 nova_compute[186662]: 2026-02-19 20:02:49.365 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 20:02:49 compute-0 nova_compute[186662]: 2026-02-19 20:02:49.872 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 20:02:50 compute-0 nova_compute[186662]: 2026-02-19 20:02:50.382 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 20:02:50 compute-0 nova_compute[186662]: 2026-02-19 20:02:50.383 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.179s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:02:51 compute-0 nova_compute[186662]: 2026-02-19 20:02:51.308 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:52 compute-0 nova_compute[186662]: 2026-02-19 20:02:52.360 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:53 compute-0 nova_compute[186662]: 2026-02-19 20:02:53.384 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:02:56 compute-0 nova_compute[186662]: 2026-02-19 20:02:56.310 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:57 compute-0 nova_compute[186662]: 2026-02-19 20:02:57.362 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:02:59 compute-0 podman[196025]: time="2026-02-19T20:02:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:02:59 compute-0 podman[196025]: @ - - [19/Feb/2026:20:02:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:02:59 compute-0 podman[196025]: @ - - [19/Feb/2026:20:02:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2200 "" "Go-http-client/1.1"
Feb 19 20:03:01 compute-0 nova_compute[186662]: 2026-02-19 20:03:01.312 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:01 compute-0 podman[224115]: 2026-02-19 20:03:01.320502482 +0000 UTC m=+0.079746930 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Feb 19 20:03:01 compute-0 openstack_network_exporter[198916]: ERROR   20:03:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:03:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:03:01 compute-0 openstack_network_exporter[198916]: ERROR   20:03:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:03:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:03:02 compute-0 nova_compute[186662]: 2026-02-19 20:03:02.364 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:06 compute-0 podman[224135]: 2026-02-19 20:03:06.270379251 +0000 UTC m=+0.050788415 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, config_id=openstack_network_exporter, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7)
Feb 19 20:03:06 compute-0 nova_compute[186662]: 2026-02-19 20:03:06.314 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:07 compute-0 podman[224158]: 2026-02-19 20:03:07.332286556 +0000 UTC m=+0.099887418 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.license=GPLv2)
Feb 19 20:03:07 compute-0 nova_compute[186662]: 2026-02-19 20:03:07.365 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:11 compute-0 podman[224186]: 2026-02-19 20:03:11.270151814 +0000 UTC m=+0.044252958 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 20:03:11 compute-0 nova_compute[186662]: 2026-02-19 20:03:11.315 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:11 compute-0 sshd-session[224184]: Invalid user ftp-test from 45.169.200.254 port 40350
Feb 19 20:03:12 compute-0 sshd-session[224184]: Received disconnect from 45.169.200.254 port 40350:11: Bye Bye [preauth]
Feb 19 20:03:12 compute-0 sshd-session[224184]: Disconnected from invalid user ftp-test 45.169.200.254 port 40350 [preauth]
Feb 19 20:03:12 compute-0 nova_compute[186662]: 2026-02-19 20:03:12.368 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:16 compute-0 nova_compute[186662]: 2026-02-19 20:03:16.317 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:17 compute-0 nova_compute[186662]: 2026-02-19 20:03:17.370 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:21 compute-0 nova_compute[186662]: 2026-02-19 20:03:21.319 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:22 compute-0 nova_compute[186662]: 2026-02-19 20:03:22.372 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:26 compute-0 nova_compute[186662]: 2026-02-19 20:03:26.322 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:27 compute-0 nova_compute[186662]: 2026-02-19 20:03:27.373 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:29 compute-0 podman[196025]: time="2026-02-19T20:03:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:03:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:03:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:03:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:03:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2201 "" "Go-http-client/1.1"
Feb 19 20:03:31 compute-0 nova_compute[186662]: 2026-02-19 20:03:31.324 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:31 compute-0 openstack_network_exporter[198916]: ERROR   20:03:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:03:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:03:31 compute-0 openstack_network_exporter[198916]: ERROR   20:03:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:03:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:03:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:03:32.179 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:03:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:03:32.180 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:03:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:03:32.180 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:03:32 compute-0 podman[224211]: 2026-02-19 20:03:32.284591739 +0000 UTC m=+0.060347009 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 20:03:32 compute-0 nova_compute[186662]: 2026-02-19 20:03:32.375 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:36 compute-0 nova_compute[186662]: 2026-02-19 20:03:36.326 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:37 compute-0 podman[224229]: 2026-02-19 20:03:37.272299288 +0000 UTC m=+0.046660145 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc.)
Feb 19 20:03:37 compute-0 nova_compute[186662]: 2026-02-19 20:03:37.378 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:37 compute-0 nova_compute[186662]: 2026-02-19 20:03:37.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:03:38 compute-0 podman[224250]: 2026-02-19 20:03:38.327561234 +0000 UTC m=+0.097446641 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.43.0, 
maintainer=OpenStack Kubernetes Operator team)
Feb 19 20:03:40 compute-0 nova_compute[186662]: 2026-02-19 20:03:40.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:03:41 compute-0 nova_compute[186662]: 2026-02-19 20:03:41.130 186666 WARNING oslo.service.backend._eventlet.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 18.94 sec
Feb 19 20:03:41 compute-0 nova_compute[186662]: 2026-02-19 20:03:41.329 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:42 compute-0 podman[224277]: 2026-02-19 20:03:42.291008649 +0000 UTC m=+0.067065322 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 19 20:03:42 compute-0 nova_compute[186662]: 2026-02-19 20:03:42.381 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:43 compute-0 nova_compute[186662]: 2026-02-19 20:03:43.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:03:46 compute-0 nova_compute[186662]: 2026-02-19 20:03:46.331 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:46 compute-0 nova_compute[186662]: 2026-02-19 20:03:46.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:03:47 compute-0 nova_compute[186662]: 2026-02-19 20:03:47.382 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:48 compute-0 nova_compute[186662]: 2026-02-19 20:03:48.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:03:48 compute-0 nova_compute[186662]: 2026-02-19 20:03:48.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 20:03:49 compute-0 nova_compute[186662]: 2026-02-19 20:03:49.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:03:49 compute-0 nova_compute[186662]: 2026-02-19 20:03:49.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:03:50 compute-0 nova_compute[186662]: 2026-02-19 20:03:50.195 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:03:50 compute-0 nova_compute[186662]: 2026-02-19 20:03:50.195 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:03:50 compute-0 nova_compute[186662]: 2026-02-19 20:03:50.196 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:03:50 compute-0 nova_compute[186662]: 2026-02-19 20:03:50.196 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 20:03:50 compute-0 nova_compute[186662]: 2026-02-19 20:03:50.316 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 20:03:50 compute-0 nova_compute[186662]: 2026-02-19 20:03:50.317 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 20:03:50 compute-0 nova_compute[186662]: 2026-02-19 20:03:50.328 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.011s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 20:03:50 compute-0 nova_compute[186662]: 2026-02-19 20:03:50.329 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5800MB free_disk=72.96742630004883GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 20:03:50 compute-0 nova_compute[186662]: 2026-02-19 20:03:50.329 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:03:50 compute-0 nova_compute[186662]: 2026-02-19 20:03:50.329 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:03:51 compute-0 nova_compute[186662]: 2026-02-19 20:03:51.333 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:51 compute-0 nova_compute[186662]: 2026-02-19 20:03:51.647 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 20:03:51 compute-0 nova_compute[186662]: 2026-02-19 20:03:51.648 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:03:50 up  1:34,  0 user,  load average: 0.02, 0.12, 0.16\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 20:03:51 compute-0 nova_compute[186662]: 2026-02-19 20:03:51.664 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 20:03:52 compute-0 nova_compute[186662]: 2026-02-19 20:03:52.386 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:52 compute-0 nova_compute[186662]: 2026-02-19 20:03:52.985 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 20:03:53 compute-0 nova_compute[186662]: 2026-02-19 20:03:53.644 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 20:03:53 compute-0 nova_compute[186662]: 2026-02-19 20:03:53.644 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.315s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:03:55 compute-0 nova_compute[186662]: 2026-02-19 20:03:55.645 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:03:56 compute-0 nova_compute[186662]: 2026-02-19 20:03:56.199 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:03:56 compute-0 nova_compute[186662]: 2026-02-19 20:03:56.335 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:57 compute-0 nova_compute[186662]: 2026-02-19 20:03:57.387 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:03:59 compute-0 podman[196025]: time="2026-02-19T20:03:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:03:59 compute-0 podman[196025]: @ - - [19/Feb/2026:20:03:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:03:59 compute-0 podman[196025]: @ - - [19/Feb/2026:20:03:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2201 "" "Go-http-client/1.1"
Feb 19 20:04:01 compute-0 nova_compute[186662]: 2026-02-19 20:04:01.337 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:01 compute-0 openstack_network_exporter[198916]: ERROR   20:04:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:04:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:04:01 compute-0 openstack_network_exporter[198916]: ERROR   20:04:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:04:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:04:02 compute-0 nova_compute[186662]: 2026-02-19 20:04:02.388 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:03 compute-0 podman[224302]: 2026-02-19 20:04:03.280285613 +0000 UTC m=+0.054823224 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0)
Feb 19 20:04:06 compute-0 nova_compute[186662]: 2026-02-19 20:04:06.361 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:07 compute-0 nova_compute[186662]: 2026-02-19 20:04:07.434 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:08 compute-0 podman[224323]: 2026-02-19 20:04:08.27242071 +0000 UTC m=+0.044795371 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, io.openshift.expose-services=, version=9.7, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, 
Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 19 20:04:09 compute-0 podman[224344]: 2026-02-19 20:04:09.296647662 +0000 UTC m=+0.073606040 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 19 20:04:11 compute-0 nova_compute[186662]: 2026-02-19 20:04:11.363 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:12 compute-0 nova_compute[186662]: 2026-02-19 20:04:12.435 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:13 compute-0 podman[224370]: 2026-02-19 20:04:13.257585018 +0000 UTC m=+0.039896220 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 20:04:16 compute-0 nova_compute[186662]: 2026-02-19 20:04:16.365 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:17 compute-0 nova_compute[186662]: 2026-02-19 20:04:17.484 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:21 compute-0 nova_compute[186662]: 2026-02-19 20:04:21.367 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:22 compute-0 nova_compute[186662]: 2026-02-19 20:04:22.484 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:26 compute-0 nova_compute[186662]: 2026-02-19 20:04:26.369 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:27 compute-0 nova_compute[186662]: 2026-02-19 20:04:27.486 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:29 compute-0 podman[196025]: time="2026-02-19T20:04:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:04:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:04:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:04:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:04:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2199 "" "Go-http-client/1.1"
Feb 19 20:04:31 compute-0 nova_compute[186662]: 2026-02-19 20:04:31.371 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:31 compute-0 openstack_network_exporter[198916]: ERROR   20:04:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:04:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:04:31 compute-0 openstack_network_exporter[198916]: ERROR   20:04:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:04:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:04:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:04:32.181 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:04:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:04:32.181 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:04:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:04:32.181 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:04:32 compute-0 nova_compute[186662]: 2026-02-19 20:04:32.486 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:34 compute-0 podman[224396]: 2026-02-19 20:04:34.284755842 +0000 UTC m=+0.057190552 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0)
Feb 19 20:04:36 compute-0 nova_compute[186662]: 2026-02-19 20:04:36.372 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:37 compute-0 nova_compute[186662]: 2026-02-19 20:04:37.487 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:37 compute-0 nova_compute[186662]: 2026-02-19 20:04:37.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:04:39 compute-0 podman[224416]: 2026-02-19 20:04:39.258248077 +0000 UTC m=+0.039956802 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, vcs-type=git, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1770267347)
Feb 19 20:04:39 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:04:39.589 105986 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fa:5e:c8', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '82:11:e2:4a:c9:a8'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Feb 19 20:04:39 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:04:39.590 105986 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Feb 19 20:04:39 compute-0 nova_compute[186662]: 2026-02-19 20:04:39.591 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:40 compute-0 podman[224439]: 2026-02-19 20:04:40.309117656 +0000 UTC m=+0.089939978 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
tcib_build_tag=watcher_latest, container_name=ovn_controller)
Feb 19 20:04:40 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:04:40.591 105986 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e8e72127-2f6b-43eb-b51a-e32006a33d3c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 19 20:04:41 compute-0 nova_compute[186662]: 2026-02-19 20:04:41.374 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:42 compute-0 nova_compute[186662]: 2026-02-19 20:04:42.490 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:42 compute-0 nova_compute[186662]: 2026-02-19 20:04:42.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:04:43 compute-0 nova_compute[186662]: 2026-02-19 20:04:43.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:04:44 compute-0 podman[224465]: 2026-02-19 20:04:44.291881143 +0000 UTC m=+0.064357306 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 19 20:04:46 compute-0 nova_compute[186662]: 2026-02-19 20:04:46.376 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:47 compute-0 nova_compute[186662]: 2026-02-19 20:04:47.492 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:47 compute-0 nova_compute[186662]: 2026-02-19 20:04:47.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:04:49 compute-0 nova_compute[186662]: 2026-02-19 20:04:49.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:04:49 compute-0 nova_compute[186662]: 2026-02-19 20:04:49.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 20:04:49 compute-0 nova_compute[186662]: 2026-02-19 20:04:49.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:04:51 compute-0 nova_compute[186662]: 2026-02-19 20:04:51.377 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:51 compute-0 nova_compute[186662]: 2026-02-19 20:04:51.412 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:04:51 compute-0 nova_compute[186662]: 2026-02-19 20:04:51.412 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:04:51 compute-0 nova_compute[186662]: 2026-02-19 20:04:51.412 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:04:51 compute-0 nova_compute[186662]: 2026-02-19 20:04:51.412 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 20:04:51 compute-0 nova_compute[186662]: 2026-02-19 20:04:51.518 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 20:04:51 compute-0 nova_compute[186662]: 2026-02-19 20:04:51.519 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 20:04:51 compute-0 nova_compute[186662]: 2026-02-19 20:04:51.534 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 20:04:51 compute-0 nova_compute[186662]: 2026-02-19 20:04:51.535 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5800MB free_disk=72.96742630004883GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 20:04:51 compute-0 nova_compute[186662]: 2026-02-19 20:04:51.535 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:04:51 compute-0 nova_compute[186662]: 2026-02-19 20:04:51.535 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:04:52 compute-0 nova_compute[186662]: 2026-02-19 20:04:52.545 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:52 compute-0 nova_compute[186662]: 2026-02-19 20:04:52.578 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 20:04:52 compute-0 nova_compute[186662]: 2026-02-19 20:04:52.578 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:04:51 up  1:36,  0 user,  load average: 0.07, 0.11, 0.15\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 20:04:52 compute-0 nova_compute[186662]: 2026-02-19 20:04:52.606 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing inventories for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Feb 19 20:04:52 compute-0 nova_compute[186662]: 2026-02-19 20:04:52.626 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating ProviderTree inventory for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Feb 19 20:04:52 compute-0 nova_compute[186662]: 2026-02-19 20:04:52.626 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating inventory in ProviderTree for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Feb 19 20:04:52 compute-0 nova_compute[186662]: 2026-02-19 20:04:52.641 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing aggregate associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Feb 19 20:04:52 compute-0 nova_compute[186662]: 2026-02-19 20:04:52.669 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing trait associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_SSE2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,HW_ARCH_X86_64,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_USB _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Feb 19 20:04:52 compute-0 nova_compute[186662]: 2026-02-19 20:04:52.690 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 20:04:53 compute-0 nova_compute[186662]: 2026-02-19 20:04:53.218 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 20:04:53 compute-0 nova_compute[186662]: 2026-02-19 20:04:53.726 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 20:04:53 compute-0 nova_compute[186662]: 2026-02-19 20:04:53.726 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.191s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:04:54 compute-0 nova_compute[186662]: 2026-02-19 20:04:54.727 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:04:54 compute-0 nova_compute[186662]: 2026-02-19 20:04:54.727 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:04:56 compute-0 nova_compute[186662]: 2026-02-19 20:04:56.379 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:57 compute-0 nova_compute[186662]: 2026-02-19 20:04:57.548 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:04:59 compute-0 podman[196025]: time="2026-02-19T20:04:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:04:59 compute-0 podman[196025]: @ - - [19/Feb/2026:20:04:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:04:59 compute-0 podman[196025]: @ - - [19/Feb/2026:20:04:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2202 "" "Go-http-client/1.1"
Feb 19 20:05:01 compute-0 nova_compute[186662]: 2026-02-19 20:05:01.381 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:01 compute-0 openstack_network_exporter[198916]: ERROR   20:05:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:05:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:05:01 compute-0 openstack_network_exporter[198916]: ERROR   20:05:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:05:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:05:02 compute-0 nova_compute[186662]: 2026-02-19 20:05:02.549 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:05 compute-0 podman[224492]: 2026-02-19 20:05:05.251851195 +0000 UTC m=+0.033135197 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true)
Feb 19 20:05:06 compute-0 nova_compute[186662]: 2026-02-19 20:05:06.384 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:07 compute-0 nova_compute[186662]: 2026-02-19 20:05:07.554 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:10 compute-0 podman[224511]: 2026-02-19 20:05:10.311585085 +0000 UTC m=+0.086813452 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, distribution-scope=public, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 19 20:05:10 compute-0 podman[224532]: 2026-02-19 20:05:10.407509987 +0000 UTC m=+0.068260370 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, org.label-schema.build-date=20260216)
Feb 19 20:05:11 compute-0 nova_compute[186662]: 2026-02-19 20:05:11.386 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:12 compute-0 nova_compute[186662]: 2026-02-19 20:05:12.554 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:15 compute-0 podman[224560]: 2026-02-19 20:05:15.289566178 +0000 UTC m=+0.061735632 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 20:05:16 compute-0 nova_compute[186662]: 2026-02-19 20:05:16.389 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:17 compute-0 nova_compute[186662]: 2026-02-19 20:05:17.556 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:21 compute-0 nova_compute[186662]: 2026-02-19 20:05:21.392 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:21 compute-0 sshd-session[224559]: error: kex_exchange_identification: read: Connection timed out
Feb 19 20:05:21 compute-0 sshd-session[224559]: banner exchange: Connection from 115.190.95.198 port 47766: Connection timed out
Feb 19 20:05:22 compute-0 nova_compute[186662]: 2026-02-19 20:05:22.559 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:26 compute-0 nova_compute[186662]: 2026-02-19 20:05:26.394 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:27 compute-0 nova_compute[186662]: 2026-02-19 20:05:27.560 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:29 compute-0 podman[196025]: time="2026-02-19T20:05:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:05:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:05:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:05:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:05:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2197 "" "Go-http-client/1.1"
Feb 19 20:05:31 compute-0 nova_compute[186662]: 2026-02-19 20:05:31.396 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:31 compute-0 openstack_network_exporter[198916]: ERROR   20:05:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:05:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:05:31 compute-0 openstack_network_exporter[198916]: ERROR   20:05:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:05:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:05:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:05:32.182 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:05:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:05:32.183 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:05:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:05:32.183 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:05:32 compute-0 nova_compute[186662]: 2026-02-19 20:05:32.563 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:33 compute-0 sshd-session[224585]: Received disconnect from 45.148.10.152 port 44848:11:  [preauth]
Feb 19 20:05:33 compute-0 sshd-session[224585]: Disconnected from authenticating user root 45.148.10.152 port 44848 [preauth]
Feb 19 20:05:36 compute-0 podman[224587]: 2026-02-19 20:05:36.274618147 +0000 UTC m=+0.045211140 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Feb 19 20:05:36 compute-0 nova_compute[186662]: 2026-02-19 20:05:36.398 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:37 compute-0 nova_compute[186662]: 2026-02-19 20:05:37.565 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:39 compute-0 nova_compute[186662]: 2026-02-19 20:05:39.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:05:41 compute-0 podman[224606]: 2026-02-19 20:05:41.288444282 +0000 UTC m=+0.065938524 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 19 20:05:41 compute-0 podman[224607]: 2026-02-19 20:05:41.3081074 +0000 UTC m=+0.078009948 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, distribution-scope=public, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 19 20:05:41 compute-0 nova_compute[186662]: 2026-02-19 20:05:41.399 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:42 compute-0 nova_compute[186662]: 2026-02-19 20:05:42.566 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:43 compute-0 nova_compute[186662]: 2026-02-19 20:05:43.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:05:43 compute-0 nova_compute[186662]: 2026-02-19 20:05:43.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:05:46 compute-0 podman[224653]: 2026-02-19 20:05:46.263297768 +0000 UTC m=+0.045022256 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 20:05:46 compute-0 nova_compute[186662]: 2026-02-19 20:05:46.401 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:47 compute-0 nova_compute[186662]: 2026-02-19 20:05:47.568 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:48 compute-0 nova_compute[186662]: 2026-02-19 20:05:48.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:05:48 compute-0 nova_compute[186662]: 2026-02-19 20:05:48.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:05:50 compute-0 nova_compute[186662]: 2026-02-19 20:05:50.083 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:05:50 compute-0 nova_compute[186662]: 2026-02-19 20:05:50.596 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:05:50 compute-0 nova_compute[186662]: 2026-02-19 20:05:50.596 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:05:50 compute-0 nova_compute[186662]: 2026-02-19 20:05:50.596 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:05:50 compute-0 nova_compute[186662]: 2026-02-19 20:05:50.596 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 20:05:50 compute-0 nova_compute[186662]: 2026-02-19 20:05:50.708 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 20:05:50 compute-0 nova_compute[186662]: 2026-02-19 20:05:50.709 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 20:05:50 compute-0 nova_compute[186662]: 2026-02-19 20:05:50.725 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 20:05:50 compute-0 nova_compute[186662]: 2026-02-19 20:05:50.726 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5813MB free_disk=72.96742630004883GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 20:05:50 compute-0 nova_compute[186662]: 2026-02-19 20:05:50.726 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:05:50 compute-0 nova_compute[186662]: 2026-02-19 20:05:50.726 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:05:51 compute-0 nova_compute[186662]: 2026-02-19 20:05:51.403 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:51 compute-0 nova_compute[186662]: 2026-02-19 20:05:51.782 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 20:05:51 compute-0 nova_compute[186662]: 2026-02-19 20:05:51.782 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:05:50 up  1:37,  0 user,  load average: 0.02, 0.09, 0.14\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 20:05:51 compute-0 nova_compute[186662]: 2026-02-19 20:05:51.806 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 20:05:52 compute-0 nova_compute[186662]: 2026-02-19 20:05:52.321 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 20:05:52 compute-0 nova_compute[186662]: 2026-02-19 20:05:52.570 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:52 compute-0 nova_compute[186662]: 2026-02-19 20:05:52.832 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 20:05:52 compute-0 nova_compute[186662]: 2026-02-19 20:05:52.832 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.106s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:05:53 compute-0 nova_compute[186662]: 2026-02-19 20:05:53.325 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:05:53 compute-0 nova_compute[186662]: 2026-02-19 20:05:53.326 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:05:53 compute-0 nova_compute[186662]: 2026-02-19 20:05:53.326 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 20:05:53 compute-0 nova_compute[186662]: 2026-02-19 20:05:53.572 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:05:54 compute-0 nova_compute[186662]: 2026-02-19 20:05:54.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:05:56 compute-0 nova_compute[186662]: 2026-02-19 20:05:56.405 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:57 compute-0 nova_compute[186662]: 2026-02-19 20:05:57.577 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:05:58 compute-0 sshd-session[224678]: Received disconnect from 103.67.78.251 port 41584:11: Bye Bye [preauth]
Feb 19 20:05:58 compute-0 sshd-session[224678]: Disconnected from authenticating user root 103.67.78.251 port 41584 [preauth]
Feb 19 20:05:59 compute-0 podman[196025]: time="2026-02-19T20:05:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:05:59 compute-0 podman[196025]: @ - - [19/Feb/2026:20:05:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:05:59 compute-0 podman[196025]: @ - - [19/Feb/2026:20:05:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2198 "" "Go-http-client/1.1"
Feb 19 20:06:01 compute-0 openstack_network_exporter[198916]: ERROR   20:06:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:06:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:06:01 compute-0 openstack_network_exporter[198916]: ERROR   20:06:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:06:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:06:01 compute-0 nova_compute[186662]: 2026-02-19 20:06:01.431 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:02 compute-0 nova_compute[186662]: 2026-02-19 20:06:02.578 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:03 compute-0 nova_compute[186662]: 2026-02-19 20:06:03.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:06:03 compute-0 nova_compute[186662]: 2026-02-19 20:06:03.577 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Feb 19 20:06:06 compute-0 nova_compute[186662]: 2026-02-19 20:06:06.433 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:07 compute-0 podman[224680]: 2026-02-19 20:06:07.258432924 +0000 UTC m=+0.038386834 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 19 20:06:07 compute-0 nova_compute[186662]: 2026-02-19 20:06:07.580 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:11 compute-0 nova_compute[186662]: 2026-02-19 20:06:11.435 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:12 compute-0 podman[224701]: 2026-02-19 20:06:12.307565586 +0000 UTC m=+0.082668051 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
container_name=ovn_controller, org.label-schema.schema-version=1.0)
Feb 19 20:06:12 compute-0 podman[224702]: 2026-02-19 20:06:12.313799878 +0000 UTC m=+0.081730728 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9/ubi-minimal, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public)
Feb 19 20:06:12 compute-0 nova_compute[186662]: 2026-02-19 20:06:12.584 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:15 compute-0 nova_compute[186662]: 2026-02-19 20:06:15.084 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:06:15 compute-0 nova_compute[186662]: 2026-02-19 20:06:15.084 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Feb 19 20:06:15 compute-0 nova_compute[186662]: 2026-02-19 20:06:15.591 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Feb 19 20:06:16 compute-0 nova_compute[186662]: 2026-02-19 20:06:16.438 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:17 compute-0 podman[224745]: 2026-02-19 20:06:17.29387628 +0000 UTC m=+0.067849670 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 20:06:17 compute-0 nova_compute[186662]: 2026-02-19 20:06:17.587 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:21 compute-0 nova_compute[186662]: 2026-02-19 20:06:21.440 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:22 compute-0 nova_compute[186662]: 2026-02-19 20:06:22.587 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:26 compute-0 nova_compute[186662]: 2026-02-19 20:06:26.442 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:27 compute-0 nova_compute[186662]: 2026-02-19 20:06:27.621 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:29 compute-0 podman[196025]: time="2026-02-19T20:06:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:06:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:06:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:06:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:06:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2197 "" "Go-http-client/1.1"
Feb 19 20:06:31 compute-0 openstack_network_exporter[198916]: ERROR   20:06:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:06:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:06:31 compute-0 openstack_network_exporter[198916]: ERROR   20:06:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:06:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:06:31 compute-0 nova_compute[186662]: 2026-02-19 20:06:31.445 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:06:32.184 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:06:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:06:32.184 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:06:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:06:32.184 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:06:32 compute-0 nova_compute[186662]: 2026-02-19 20:06:32.623 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:36 compute-0 nova_compute[186662]: 2026-02-19 20:06:36.447 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:37 compute-0 nova_compute[186662]: 2026-02-19 20:06:37.626 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:38 compute-0 podman[224770]: 2026-02-19 20:06:38.260466832 +0000 UTC m=+0.042500914 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Feb 19 20:06:41 compute-0 nova_compute[186662]: 2026-02-19 20:06:41.448 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:42 compute-0 nova_compute[186662]: 2026-02-19 20:06:42.083 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:06:42 compute-0 nova_compute[186662]: 2026-02-19 20:06:42.676 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:43 compute-0 podman[224790]: 2026-02-19 20:06:43.275409774 +0000 UTC m=+0.051217387 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, version=9.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1770267347, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, vcs-type=git, vendor=Red Hat, Inc.)
Feb 19 20:06:43 compute-0 podman[224789]: 2026-02-19 20:06:43.345744013 +0000 UTC m=+0.120922450 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, 
tcib_build_tag=watcher_latest, config_id=ovn_controller)
Feb 19 20:06:43 compute-0 nova_compute[186662]: 2026-02-19 20:06:43.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:06:45 compute-0 sshd-session[224835]: Received disconnect from 45.169.200.254 port 46022:11: Bye Bye [preauth]
Feb 19 20:06:45 compute-0 sshd-session[224835]: Disconnected from authenticating user root 45.169.200.254 port 46022 [preauth]
Feb 19 20:06:45 compute-0 nova_compute[186662]: 2026-02-19 20:06:45.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:06:46 compute-0 nova_compute[186662]: 2026-02-19 20:06:46.450 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:47 compute-0 nova_compute[186662]: 2026-02-19 20:06:47.678 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:48 compute-0 podman[224837]: 2026-02-19 20:06:48.276965159 +0000 UTC m=+0.056194997 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 19 20:06:48 compute-0 nova_compute[186662]: 2026-02-19 20:06:48.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:06:49 compute-0 nova_compute[186662]: 2026-02-19 20:06:49.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:06:50 compute-0 nova_compute[186662]: 2026-02-19 20:06:50.093 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:06:50 compute-0 nova_compute[186662]: 2026-02-19 20:06:50.094 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:06:50 compute-0 nova_compute[186662]: 2026-02-19 20:06:50.094 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:06:50 compute-0 nova_compute[186662]: 2026-02-19 20:06:50.094 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 20:06:50 compute-0 nova_compute[186662]: 2026-02-19 20:06:50.214 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 20:06:50 compute-0 nova_compute[186662]: 2026-02-19 20:06:50.216 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 20:06:50 compute-0 nova_compute[186662]: 2026-02-19 20:06:50.226 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.010s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 20:06:50 compute-0 nova_compute[186662]: 2026-02-19 20:06:50.227 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5803MB free_disk=72.96742630004883GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 20:06:50 compute-0 nova_compute[186662]: 2026-02-19 20:06:50.227 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:06:50 compute-0 nova_compute[186662]: 2026-02-19 20:06:50.227 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:06:51 compute-0 nova_compute[186662]: 2026-02-19 20:06:51.279 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 20:06:51 compute-0 nova_compute[186662]: 2026-02-19 20:06:51.280 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:06:50 up  1:37,  0 user,  load average: 0.01, 0.07, 0.13\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 20:06:51 compute-0 nova_compute[186662]: 2026-02-19 20:06:51.301 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 20:06:51 compute-0 nova_compute[186662]: 2026-02-19 20:06:51.451 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:51 compute-0 nova_compute[186662]: 2026-02-19 20:06:51.809 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 20:06:52 compute-0 nova_compute[186662]: 2026-02-19 20:06:52.317 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 20:06:52 compute-0 nova_compute[186662]: 2026-02-19 20:06:52.317 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.090s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:06:52 compute-0 nova_compute[186662]: 2026-02-19 20:06:52.682 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:53 compute-0 nova_compute[186662]: 2026-02-19 20:06:53.318 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:06:53 compute-0 nova_compute[186662]: 2026-02-19 20:06:53.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:06:53 compute-0 nova_compute[186662]: 2026-02-19 20:06:53.575 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 20:06:55 compute-0 nova_compute[186662]: 2026-02-19 20:06:55.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:06:56 compute-0 nova_compute[186662]: 2026-02-19 20:06:56.453 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:56 compute-0 nova_compute[186662]: 2026-02-19 20:06:56.501 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:06:57 compute-0 nova_compute[186662]: 2026-02-19 20:06:57.682 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:06:59 compute-0 podman[196025]: time="2026-02-19T20:06:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:06:59 compute-0 podman[196025]: @ - - [19/Feb/2026:20:06:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:06:59 compute-0 podman[196025]: @ - - [19/Feb/2026:20:06:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2199 "" "Go-http-client/1.1"
Feb 19 20:07:01 compute-0 openstack_network_exporter[198916]: ERROR   20:07:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:07:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:07:01 compute-0 openstack_network_exporter[198916]: ERROR   20:07:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:07:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:07:01 compute-0 nova_compute[186662]: 2026-02-19 20:07:01.456 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:02 compute-0 nova_compute[186662]: 2026-02-19 20:07:02.682 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:06 compute-0 nova_compute[186662]: 2026-02-19 20:07:06.458 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:07 compute-0 nova_compute[186662]: 2026-02-19 20:07:07.685 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:09 compute-0 podman[224865]: 2026-02-19 20:07:09.281431754 +0000 UTC m=+0.053023705 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 19 20:07:11 compute-0 sshd-session[224884]: Invalid user iksi from 96.78.175.42 port 52670
Feb 19 20:07:11 compute-0 sshd-session[224884]: Received disconnect from 96.78.175.42 port 52670:11: Bye Bye [preauth]
Feb 19 20:07:11 compute-0 sshd-session[224884]: Disconnected from invalid user iksi 96.78.175.42 port 52670 [preauth]
Feb 19 20:07:11 compute-0 nova_compute[186662]: 2026-02-19 20:07:11.460 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:12 compute-0 nova_compute[186662]: 2026-02-19 20:07:12.687 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:14 compute-0 podman[224887]: 2026-02-19 20:07:14.270343961 +0000 UTC m=+0.047363068 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, version=9.7, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1770267347, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 19 20:07:14 compute-0 podman[224886]: 2026-02-19 20:07:14.293297257 +0000 UTC m=+0.072355233 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260216, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 19 20:07:16 compute-0 nova_compute[186662]: 2026-02-19 20:07:16.462 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:17 compute-0 nova_compute[186662]: 2026-02-19 20:07:17.691 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:19 compute-0 podman[224932]: 2026-02-19 20:07:19.26050846 +0000 UTC m=+0.040870090 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 19 20:07:21 compute-0 nova_compute[186662]: 2026-02-19 20:07:21.464 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:22 compute-0 nova_compute[186662]: 2026-02-19 20:07:22.692 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:26 compute-0 nova_compute[186662]: 2026-02-19 20:07:26.466 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:27 compute-0 nova_compute[186662]: 2026-02-19 20:07:27.695 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:29 compute-0 podman[196025]: time="2026-02-19T20:07:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:07:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:07:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:07:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:07:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2197 "" "Go-http-client/1.1"
Feb 19 20:07:31 compute-0 openstack_network_exporter[198916]: ERROR   20:07:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:07:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:07:31 compute-0 openstack_network_exporter[198916]: ERROR   20:07:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:07:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:07:31 compute-0 nova_compute[186662]: 2026-02-19 20:07:31.467 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:07:32.185 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:07:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:07:32.185 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:07:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:07:32.185 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:07:32 compute-0 nova_compute[186662]: 2026-02-19 20:07:32.696 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:36 compute-0 nova_compute[186662]: 2026-02-19 20:07:36.469 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:37 compute-0 nova_compute[186662]: 2026-02-19 20:07:37.699 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:40 compute-0 podman[224957]: 2026-02-19 20:07:40.280243184 +0000 UTC m=+0.054634014 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Feb 19 20:07:41 compute-0 nova_compute[186662]: 2026-02-19 20:07:41.472 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:42 compute-0 nova_compute[186662]: 2026-02-19 20:07:42.701 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:43 compute-0 nova_compute[186662]: 2026-02-19 20:07:43.084 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:07:45 compute-0 podman[224977]: 2026-02-19 20:07:45.280905715 +0000 UTC m=+0.058709812 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Feb 19 20:07:45 compute-0 podman[224976]: 2026-02-19 20:07:45.299414983 +0000 UTC m=+0.078291896 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 
10 Base Image, org.label-schema.vendor=CentOS)
Feb 19 20:07:45 compute-0 nova_compute[186662]: 2026-02-19 20:07:45.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:07:46 compute-0 nova_compute[186662]: 2026-02-19 20:07:46.474 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:47 compute-0 nova_compute[186662]: 2026-02-19 20:07:47.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:07:47 compute-0 nova_compute[186662]: 2026-02-19 20:07:47.702 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:49 compute-0 nova_compute[186662]: 2026-02-19 20:07:49.570 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:07:50 compute-0 podman[225022]: 2026-02-19 20:07:50.261548602 +0000 UTC m=+0.042614312 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 19 20:07:51 compute-0 nova_compute[186662]: 2026-02-19 20:07:51.476 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:51 compute-0 nova_compute[186662]: 2026-02-19 20:07:51.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:07:51 compute-0 nova_compute[186662]: 2026-02-19 20:07:51.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:07:52 compute-0 nova_compute[186662]: 2026-02-19 20:07:52.089 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:07:52 compute-0 nova_compute[186662]: 2026-02-19 20:07:52.090 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:07:52 compute-0 nova_compute[186662]: 2026-02-19 20:07:52.090 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:07:52 compute-0 nova_compute[186662]: 2026-02-19 20:07:52.090 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 20:07:52 compute-0 nova_compute[186662]: 2026-02-19 20:07:52.215 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 20:07:52 compute-0 nova_compute[186662]: 2026-02-19 20:07:52.216 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 20:07:52 compute-0 nova_compute[186662]: 2026-02-19 20:07:52.228 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 20:07:52 compute-0 nova_compute[186662]: 2026-02-19 20:07:52.228 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5804MB free_disk=72.96742630004883GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 20:07:52 compute-0 nova_compute[186662]: 2026-02-19 20:07:52.229 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:07:52 compute-0 nova_compute[186662]: 2026-02-19 20:07:52.229 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:07:52 compute-0 nova_compute[186662]: 2026-02-19 20:07:52.703 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:53 compute-0 nova_compute[186662]: 2026-02-19 20:07:53.295 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 20:07:53 compute-0 nova_compute[186662]: 2026-02-19 20:07:53.296 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:07:52 up  1:39,  0 user,  load average: 0.00, 0.05, 0.11\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 20:07:53 compute-0 nova_compute[186662]: 2026-02-19 20:07:53.379 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 20:07:53 compute-0 nova_compute[186662]: 2026-02-19 20:07:53.887 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 20:07:54 compute-0 nova_compute[186662]: 2026-02-19 20:07:54.396 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 20:07:54 compute-0 nova_compute[186662]: 2026-02-19 20:07:54.396 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.167s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:07:56 compute-0 nova_compute[186662]: 2026-02-19 20:07:56.396 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:07:56 compute-0 nova_compute[186662]: 2026-02-19 20:07:56.478 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:56 compute-0 nova_compute[186662]: 2026-02-19 20:07:56.904 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:07:56 compute-0 nova_compute[186662]: 2026-02-19 20:07:56.905 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:07:56 compute-0 nova_compute[186662]: 2026-02-19 20:07:56.905 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 20:07:57 compute-0 nova_compute[186662]: 2026-02-19 20:07:57.705 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:07:59 compute-0 podman[196025]: time="2026-02-19T20:07:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:07:59 compute-0 podman[196025]: @ - - [19/Feb/2026:20:07:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:07:59 compute-0 podman[196025]: @ - - [19/Feb/2026:20:07:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2196 "" "Go-http-client/1.1"
Feb 19 20:08:01 compute-0 openstack_network_exporter[198916]: ERROR   20:08:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:08:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:08:01 compute-0 openstack_network_exporter[198916]: ERROR   20:08:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:08:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:08:01 compute-0 nova_compute[186662]: 2026-02-19 20:08:01.479 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:02 compute-0 nova_compute[186662]: 2026-02-19 20:08:02.706 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:06 compute-0 nova_compute[186662]: 2026-02-19 20:08:06.481 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:07 compute-0 nova_compute[186662]: 2026-02-19 20:08:07.708 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:11 compute-0 podman[225047]: 2026-02-19 20:08:11.259241724 +0000 UTC m=+0.036330531 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 19 20:08:11 compute-0 nova_compute[186662]: 2026-02-19 20:08:11.484 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:12 compute-0 nova_compute[186662]: 2026-02-19 20:08:12.709 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:16 compute-0 podman[225067]: 2026-02-19 20:08:16.281782656 +0000 UTC m=+0.054527121 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, config_id=openstack_network_exporter, managed_by=edpm_ansible, version=9.7, io.buildah.version=1.33.7)
Feb 19 20:08:16 compute-0 podman[225066]: 2026-02-19 20:08:16.296551384 +0000 UTC m=+0.071799839 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.43.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible)
Feb 19 20:08:16 compute-0 nova_compute[186662]: 2026-02-19 20:08:16.485 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:17 compute-0 nova_compute[186662]: 2026-02-19 20:08:17.710 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:21 compute-0 podman[225113]: 2026-02-19 20:08:21.262355951 +0000 UTC m=+0.039934817 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 20:08:21 compute-0 nova_compute[186662]: 2026-02-19 20:08:21.487 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:22 compute-0 nova_compute[186662]: 2026-02-19 20:08:22.713 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:26 compute-0 nova_compute[186662]: 2026-02-19 20:08:26.488 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:27 compute-0 nova_compute[186662]: 2026-02-19 20:08:27.713 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:29 compute-0 podman[196025]: time="2026-02-19T20:08:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:08:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:08:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:08:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:08:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2197 "" "Go-http-client/1.1"
Feb 19 20:08:31 compute-0 openstack_network_exporter[198916]: ERROR   20:08:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:08:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:08:31 compute-0 openstack_network_exporter[198916]: ERROR   20:08:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:08:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:08:31 compute-0 nova_compute[186662]: 2026-02-19 20:08:31.489 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:08:32.186 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:08:32.186 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:08:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:08:32.187 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:08:32 compute-0 nova_compute[186662]: 2026-02-19 20:08:32.716 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:36 compute-0 nova_compute[186662]: 2026-02-19 20:08:36.492 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:37 compute-0 nova_compute[186662]: 2026-02-19 20:08:37.718 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:41 compute-0 nova_compute[186662]: 2026-02-19 20:08:41.495 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:42 compute-0 podman[225138]: 2026-02-19 20:08:42.265152076 +0000 UTC m=+0.041606929 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, tcib_build_tag=watcher_latest, tcib_managed=true)
Feb 19 20:08:42 compute-0 nova_compute[186662]: 2026-02-19 20:08:42.576 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:08:42 compute-0 nova_compute[186662]: 2026-02-19 20:08:42.719 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:46 compute-0 nova_compute[186662]: 2026-02-19 20:08:46.497 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:46 compute-0 nova_compute[186662]: 2026-02-19 20:08:46.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:08:47 compute-0 podman[225159]: 2026-02-19 20:08:47.286604041 +0000 UTC m=+0.062067892 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1770267347, name=ubi9/ubi-minimal, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=)
Feb 19 20:08:47 compute-0 podman[225158]: 2026-02-19 20:08:47.299455613 +0000 UTC m=+0.080060059 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, 
org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Feb 19 20:08:47 compute-0 nova_compute[186662]: 2026-02-19 20:08:47.722 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:48 compute-0 nova_compute[186662]: 2026-02-19 20:08:48.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:08:49 compute-0 nova_compute[186662]: 2026-02-19 20:08:49.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:08:51 compute-0 nova_compute[186662]: 2026-02-19 20:08:51.499 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:51 compute-0 nova_compute[186662]: 2026-02-19 20:08:51.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:08:52 compute-0 podman[225203]: 2026-02-19 20:08:52.275759346 +0000 UTC m=+0.052595124 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 19 20:08:52 compute-0 nova_compute[186662]: 2026-02-19 20:08:52.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:08:52 compute-0 nova_compute[186662]: 2026-02-19 20:08:52.723 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:53 compute-0 nova_compute[186662]: 2026-02-19 20:08:53.100 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:08:53 compute-0 nova_compute[186662]: 2026-02-19 20:08:53.100 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:08:53 compute-0 nova_compute[186662]: 2026-02-19 20:08:53.100 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:08:53 compute-0 nova_compute[186662]: 2026-02-19 20:08:53.100 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 20:08:53 compute-0 nova_compute[186662]: 2026-02-19 20:08:53.215 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 20:08:53 compute-0 nova_compute[186662]: 2026-02-19 20:08:53.216 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 20:08:53 compute-0 nova_compute[186662]: 2026-02-19 20:08:53.227 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.010s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 20:08:53 compute-0 nova_compute[186662]: 2026-02-19 20:08:53.227 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5796MB free_disk=72.96744155883789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 20:08:53 compute-0 nova_compute[186662]: 2026-02-19 20:08:53.227 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:08:53 compute-0 nova_compute[186662]: 2026-02-19 20:08:53.227 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:08:54 compute-0 nova_compute[186662]: 2026-02-19 20:08:54.289 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 20:08:54 compute-0 nova_compute[186662]: 2026-02-19 20:08:54.290 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:08:53 up  1:40,  0 user,  load average: 0.00, 0.04, 0.10\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 20:08:54 compute-0 nova_compute[186662]: 2026-02-19 20:08:54.306 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 20:08:54 compute-0 nova_compute[186662]: 2026-02-19 20:08:54.813 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 20:08:54 compute-0 sshd-session[225228]: Invalid user odoo15 from 139.59.133.246 port 35356
Feb 19 20:08:54 compute-0 sshd-session[225228]: Received disconnect from 139.59.133.246 port 35356:11: Bye Bye [preauth]
Feb 19 20:08:54 compute-0 sshd-session[225228]: Disconnected from invalid user odoo15 139.59.133.246 port 35356 [preauth]
Feb 19 20:08:55 compute-0 nova_compute[186662]: 2026-02-19 20:08:55.323 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 20:08:55 compute-0 nova_compute[186662]: 2026-02-19 20:08:55.324 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.096s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:08:56 compute-0 nova_compute[186662]: 2026-02-19 20:08:56.501 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:57 compute-0 nova_compute[186662]: 2026-02-19 20:08:57.725 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:08:58 compute-0 nova_compute[186662]: 2026-02-19 20:08:58.324 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:08:58 compute-0 nova_compute[186662]: 2026-02-19 20:08:58.326 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:08:58 compute-0 nova_compute[186662]: 2026-02-19 20:08:58.326 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 20:08:59 compute-0 podman[196025]: time="2026-02-19T20:08:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:08:59 compute-0 podman[196025]: @ - - [19/Feb/2026:20:08:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:08:59 compute-0 podman[196025]: @ - - [19/Feb/2026:20:08:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2193 "" "Go-http-client/1.1"
Feb 19 20:09:01 compute-0 openstack_network_exporter[198916]: ERROR   20:09:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:09:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:09:01 compute-0 openstack_network_exporter[198916]: ERROR   20:09:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:09:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:09:01 compute-0 nova_compute[186662]: 2026-02-19 20:09:01.504 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:02 compute-0 nova_compute[186662]: 2026-02-19 20:09:02.726 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:06 compute-0 nova_compute[186662]: 2026-02-19 20:09:06.506 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:07 compute-0 nova_compute[186662]: 2026-02-19 20:09:07.728 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:11 compute-0 nova_compute[186662]: 2026-02-19 20:09:11.508 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:12 compute-0 nova_compute[186662]: 2026-02-19 20:09:12.732 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:13 compute-0 podman[225230]: 2026-02-19 20:09:13.263599519 +0000 UTC m=+0.035810998 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 19 20:09:16 compute-0 nova_compute[186662]: 2026-02-19 20:09:16.509 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:17 compute-0 nova_compute[186662]: 2026-02-19 20:09:17.733 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:18 compute-0 podman[225251]: 2026-02-19 20:09:18.272740106 +0000 UTC m=+0.043645067 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, vcs-type=git, architecture=x86_64, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 19 20:09:18 compute-0 podman[225250]: 2026-02-19 20:09:18.286402577 +0000 UTC m=+0.062951665 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20260216, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0)
Feb 19 20:09:21 compute-0 nova_compute[186662]: 2026-02-19 20:09:21.510 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:22 compute-0 nova_compute[186662]: 2026-02-19 20:09:22.734 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:23 compute-0 podman[225297]: 2026-02-19 20:09:23.288551805 +0000 UTC m=+0.060070654 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 19 20:09:26 compute-0 nova_compute[186662]: 2026-02-19 20:09:26.512 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:27 compute-0 nova_compute[186662]: 2026-02-19 20:09:27.736 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:29 compute-0 podman[196025]: time="2026-02-19T20:09:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:09:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:09:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:09:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:09:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2195 "" "Go-http-client/1.1"
Feb 19 20:09:31 compute-0 openstack_network_exporter[198916]: ERROR   20:09:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:09:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:09:31 compute-0 openstack_network_exporter[198916]: ERROR   20:09:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:09:31 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:09:31 compute-0 nova_compute[186662]: 2026-02-19 20:09:31.512 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:09:32.187 105986 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:09:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:09:32.188 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:09:32 compute-0 ovn_metadata_agent[105981]: 2026-02-19 20:09:32.188 105986 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:09:32 compute-0 nova_compute[186662]: 2026-02-19 20:09:32.741 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:36 compute-0 nova_compute[186662]: 2026-02-19 20:09:36.514 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:37 compute-0 nova_compute[186662]: 2026-02-19 20:09:37.740 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:41 compute-0 nova_compute[186662]: 2026-02-19 20:09:41.516 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:42 compute-0 nova_compute[186662]: 2026-02-19 20:09:42.742 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:43 compute-0 nova_compute[186662]: 2026-02-19 20:09:43.577 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:09:44 compute-0 podman[225323]: 2026-02-19 20:09:44.274684559 +0000 UTC m=+0.049948561 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 19 20:09:46 compute-0 nova_compute[186662]: 2026-02-19 20:09:46.518 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:47 compute-0 nova_compute[186662]: 2026-02-19 20:09:47.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:09:47 compute-0 nova_compute[186662]: 2026-02-19 20:09:47.744 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:49 compute-0 podman[225344]: 2026-02-19 20:09:49.292699822 +0000 UTC m=+0.061700285 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1770267347, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container)
Feb 19 20:09:49 compute-0 podman[225343]: 2026-02-19 20:09:49.319442719 +0000 UTC m=+0.092527911 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Feb 19 20:09:49 compute-0 nova_compute[186662]: 2026-02-19 20:09:49.571 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:09:49 compute-0 nova_compute[186662]: 2026-02-19 20:09:49.574 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:09:51 compute-0 nova_compute[186662]: 2026-02-19 20:09:51.520 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:51 compute-0 sshd-session[225391]: Invalid user teamspeak3 from 103.67.78.251 port 40132
Feb 19 20:09:51 compute-0 sshd-session[225391]: Received disconnect from 103.67.78.251 port 40132:11: Bye Bye [preauth]
Feb 19 20:09:51 compute-0 sshd-session[225391]: Disconnected from invalid user teamspeak3 103.67.78.251 port 40132 [preauth]
Feb 19 20:09:52 compute-0 nova_compute[186662]: 2026-02-19 20:09:52.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:09:52 compute-0 nova_compute[186662]: 2026-02-19 20:09:52.575 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:09:52 compute-0 nova_compute[186662]: 2026-02-19 20:09:52.745 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:53 compute-0 nova_compute[186662]: 2026-02-19 20:09:53.146 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:09:53 compute-0 nova_compute[186662]: 2026-02-19 20:09:53.147 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:09:53 compute-0 nova_compute[186662]: 2026-02-19 20:09:53.147 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:09:53 compute-0 nova_compute[186662]: 2026-02-19 20:09:53.148 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Feb 19 20:09:53 compute-0 nova_compute[186662]: 2026-02-19 20:09:53.311 186666 WARNING nova.virt.libvirt.driver [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 19 20:09:53 compute-0 nova_compute[186662]: 2026-02-19 20:09:53.312 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Feb 19 20:09:53 compute-0 nova_compute[186662]: 2026-02-19 20:09:53.325 186666 DEBUG oslo_concurrency.processutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.013s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Feb 19 20:09:53 compute-0 nova_compute[186662]: 2026-02-19 20:09:53.326 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5799MB free_disk=72.96744155883789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Feb 19 20:09:53 compute-0 nova_compute[186662]: 2026-02-19 20:09:53.327 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Feb 19 20:09:53 compute-0 nova_compute[186662]: 2026-02-19 20:09:53.327 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Feb 19 20:09:54 compute-0 podman[225394]: 2026-02-19 20:09:54.272368468 +0000 UTC m=+0.050899964 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 19 20:09:54 compute-0 nova_compute[186662]: 2026-02-19 20:09:54.366 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Feb 19 20:09:54 compute-0 nova_compute[186662]: 2026-02-19 20:09:54.366 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:09:53 up  1:41,  0 user,  load average: 0.00, 0.03, 0.09\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Feb 19 20:09:54 compute-0 nova_compute[186662]: 2026-02-19 20:09:54.465 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing inventories for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Feb 19 20:09:54 compute-0 nova_compute[186662]: 2026-02-19 20:09:54.478 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating ProviderTree inventory for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Feb 19 20:09:54 compute-0 nova_compute[186662]: 2026-02-19 20:09:54.478 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Updating inventory in ProviderTree for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Feb 19 20:09:54 compute-0 nova_compute[186662]: 2026-02-19 20:09:54.490 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing aggregate associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Feb 19 20:09:54 compute-0 nova_compute[186662]: 2026-02-19 20:09:54.504 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Refreshing trait associations for resource provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_SSE2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_IDE,HW_ARCH_X86_64,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_USB _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Feb 19 20:09:54 compute-0 nova_compute[186662]: 2026-02-19 20:09:54.526 186666 DEBUG nova.compute.provider_tree [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed in ProviderTree for provider: 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Feb 19 20:09:55 compute-0 nova_compute[186662]: 2026-02-19 20:09:55.033 186666 DEBUG nova.scheduler.client.report [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Inventory has not changed for provider 11ecaf50-b8a2-48b5-a41c-a8b0b10798d6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Feb 19 20:09:55 compute-0 nova_compute[186662]: 2026-02-19 20:09:55.548 186666 DEBUG nova.compute.resource_tracker [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Feb 19 20:09:55 compute-0 nova_compute[186662]: 2026-02-19 20:09:55.548 186666 DEBUG oslo_concurrency.lockutils [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.221s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Feb 19 20:09:56 compute-0 nova_compute[186662]: 2026-02-19 20:09:56.522 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:57 compute-0 nova_compute[186662]: 2026-02-19 20:09:57.746 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:09:59 compute-0 nova_compute[186662]: 2026-02-19 20:09:59.544 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:09:59 compute-0 podman[196025]: time="2026-02-19T20:09:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:09:59 compute-0 podman[196025]: @ - - [19/Feb/2026:20:09:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:09:59 compute-0 podman[196025]: @ - - [19/Feb/2026:20:09:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2200 "" "Go-http-client/1.1"
Feb 19 20:10:00 compute-0 nova_compute[186662]: 2026-02-19 20:10:00.052 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:10:00 compute-0 nova_compute[186662]: 2026-02-19 20:10:00.052 186666 DEBUG oslo_service.periodic_task [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Feb 19 20:10:00 compute-0 nova_compute[186662]: 2026-02-19 20:10:00.052 186666 DEBUG nova.compute.manager [None req-7c68f1c2-f82c-4ecd-9f29-4ab77c1de840 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Feb 19 20:10:01 compute-0 openstack_network_exporter[198916]: ERROR   20:10:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 19 20:10:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:10:01 compute-0 openstack_network_exporter[198916]: ERROR   20:10:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 19 20:10:01 compute-0 openstack_network_exporter[198916]: 
Feb 19 20:10:01 compute-0 nova_compute[186662]: 2026-02-19 20:10:01.523 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:10:02 compute-0 nova_compute[186662]: 2026-02-19 20:10:02.747 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:10:06 compute-0 nova_compute[186662]: 2026-02-19 20:10:06.525 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:10:07 compute-0 nova_compute[186662]: 2026-02-19 20:10:07.749 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:10:11 compute-0 nova_compute[186662]: 2026-02-19 20:10:11.527 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:10:12 compute-0 nova_compute[186662]: 2026-02-19 20:10:12.752 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:10:15 compute-0 podman[225419]: 2026-02-19 20:10:15.281457297 +0000 UTC m=+0.055964896 container health_status 1099d2d049928d569ec65c1d23debc357534d47fe3d9d81ef56cd1af739398c1 (image=38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 19 20:10:16 compute-0 nova_compute[186662]: 2026-02-19 20:10:16.530 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:10:17 compute-0 nova_compute[186662]: 2026-02-19 20:10:17.755 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:10:19 compute-0 sshd-session[225439]: Invalid user n8n from 96.78.175.42 port 41398
Feb 19 20:10:19 compute-0 sshd-session[225439]: Received disconnect from 96.78.175.42 port 41398:11: Bye Bye [preauth]
Feb 19 20:10:19 compute-0 sshd-session[225439]: Disconnected from invalid user n8n 96.78.175.42 port 41398 [preauth]
Feb 19 20:10:20 compute-0 sshd-session[225441]: Accepted publickey for zuul from 192.168.122.10 port 48654 ssh2: ECDSA SHA256:CbHEcdYnGzya4dnWSa+K2wM0tHLLhSzcIxnybSMJF9c
Feb 19 20:10:20 compute-0 systemd-logind[822]: New session 43 of user zuul.
Feb 19 20:10:20 compute-0 systemd[1]: Started Session 43 of User zuul.
Feb 19 20:10:20 compute-0 sshd-session[225441]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 19 20:10:20 compute-0 podman[225444]: 2026-02-19 20:10:20.055180963 +0000 UTC m=+0.043318720 container health_status 7bb6e0332e55d06f93b8303dc0d826f985f2cb0f488c67704183c1c7465cd397 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible)
Feb 19 20:10:20 compute-0 podman[225443]: 2026-02-19 20:10:20.080056505 +0000 UTC m=+0.070881977 container health_status 57d4693e61d56f4d570ea68f9b5d6e3da94ea6bb583d2662b7baf56f5c9bae7e (image=38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f85f0c6f14c53fd468b70f5d81e923cb743e203d17a5c25373ee86898c006d3e-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.75:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.43.0)
Feb 19 20:10:20 compute-0 sudo[225493]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Feb 19 20:10:20 compute-0 sudo[225493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 19 20:10:21 compute-0 nova_compute[186662]: 2026-02-19 20:10:21.531 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:10:22 compute-0 nova_compute[186662]: 2026-02-19 20:10:22.758 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:10:23 compute-0 ovs-vsctl[225662]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Feb 19 20:10:24 compute-0 virtqemud[186157]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Feb 19 20:10:24 compute-0 virtqemud[186157]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Feb 19 20:10:24 compute-0 virtqemud[186157]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 19 20:10:24 compute-0 podman[225852]: 2026-02-19 20:10:24.7553749 +0000 UTC m=+0.057150645 container health_status a8bf73042c5b48a46ddbac44a5cdca099917955be0cfd24ee08a25adf2492fb9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '743a99b3721dd32ac313e0c4ded11800ee526bd9dc1f2708328ef4bbbac8834b-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 19 20:10:25 compute-0 crontab[226085]: (root) LIST (root)
Feb 19 20:10:26 compute-0 nova_compute[186662]: 2026-02-19 20:10:26.533 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:10:26 compute-0 systemd[1]: Starting Hostname Service...
Feb 19 20:10:27 compute-0 systemd[1]: Started Hostname Service.
Feb 19 20:10:27 compute-0 sshd-session[226154]: Received disconnect from 45.169.200.254 port 51712:11: Bye Bye [preauth]
Feb 19 20:10:27 compute-0 sshd-session[226154]: Disconnected from authenticating user root 45.169.200.254 port 51712 [preauth]
Feb 19 20:10:27 compute-0 sshd-session[226161]: Invalid user spadon from 132.248.170.105 port 52829
Feb 19 20:10:27 compute-0 sshd-session[226161]: Received disconnect from 132.248.170.105 port 52829:11: Bye Bye [preauth]
Feb 19 20:10:27 compute-0 sshd-session[226161]: Disconnected from invalid user spadon 132.248.170.105 port 52829 [preauth]
Feb 19 20:10:27 compute-0 nova_compute[186662]: 2026-02-19 20:10:27.759 186666 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Feb 19 20:10:29 compute-0 podman[196025]: time="2026-02-19T20:10:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 19 20:10:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:10:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15998 "" "Go-http-client/1.1"
Feb 19 20:10:29 compute-0 podman[196025]: @ - - [19/Feb/2026:20:10:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2196 "" "Go-http-client/1.1"
